Mar 20 17:16:28 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 17:16:29 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 17:16:29 crc restorecon[4687]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 17:16:29 crc restorecon[4687]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 
crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc 
restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 17:16:29 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 17:16:30 crc kubenswrapper[4803]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 17:16:30 crc kubenswrapper[4803]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 17:16:30 crc kubenswrapper[4803]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 17:16:30 crc kubenswrapper[4803]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 17:16:30 crc kubenswrapper[4803]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 17:16:30 crc kubenswrapper[4803]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.562724 4803 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569263 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569297 4803 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569307 4803 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569315 4803 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569324 4803 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569333 4803 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569343 4803 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569352 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569359 4803 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569367 4803 
feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569375 4803 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569383 4803 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569390 4803 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569398 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569406 4803 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569414 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569422 4803 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569430 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569437 4803 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569445 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569453 4803 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569461 4803 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569479 4803 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569487 4803 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569495 4803 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569504 4803 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569512 4803 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569520 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569552 4803 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569560 4803 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569567 4803 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569576 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569584 4803 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569591 4803 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569599 4803 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569606 4803 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569614 4803 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569622 4803 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569630 4803 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569639 4803 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569646 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569654 4803 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569662 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569672 4803 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569683 4803 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569691 4803 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569701 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569708 4803 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569716 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569724 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569731 4803 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569739 4803 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569747 4803 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569758 4803 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569766 4803 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569777 4803 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569787 4803 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569796 4803 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569808 4803 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569816 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569824 4803 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569832 4803 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569842 4803 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569851 4803 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569860 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569868 4803 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569875 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.569883 4803 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.570296 4803 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.570379 4803 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.570389 4803 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570661 4803 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570693 4803 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570708 4803 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570722 4803 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570735 4803 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570745 4803 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570757 4803 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570769 4803 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570787 4803 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570798 4803 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570809 4803 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570820 4803 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570829 4803 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570839 4803 flags.go:64] FLAG: --cgroup-root=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570912 4803 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570927 4803 flags.go:64] FLAG: --client-ca-file=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570941 4803 flags.go:64] FLAG: --cloud-config=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570962 4803 flags.go:64] FLAG: --cloud-provider=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570973 4803 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.570992 4803 flags.go:64] FLAG: --cluster-domain=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571003 4803 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571017 4803 flags.go:64] FLAG: --config-dir=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571027 4803 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571043 4803 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571059 4803 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571079 4803 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571092 4803 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571104 4803 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571116 4803 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571130 4803 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571141 4803 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571153 4803 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571164 4803 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571179 4803 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571203 4803 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571215 4803 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571226 4803 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571238 4803 flags.go:64] FLAG: --enable-server="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571249 4803 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571265 4803 flags.go:64] FLAG: --event-burst="100"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571277 4803 flags.go:64] FLAG: --event-qps="50"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571288 4803 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571310 4803 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571322 4803 flags.go:64] FLAG: --eviction-hard=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571336 4803 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571348 4803 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571360 4803 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571371 4803 flags.go:64] FLAG: --eviction-soft=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571384 4803 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571396 4803 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571407 4803 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571427 4803 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571438 4803 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571450 4803 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571460 4803 flags.go:64] FLAG: --feature-gates=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571474 4803 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571485 4803 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571497 4803 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571509 4803 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571569 4803 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571584 4803 flags.go:64] FLAG: --help="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571596 4803 flags.go:64] FLAG: --hostname-override=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571607 4803 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571620 4803 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571632 4803 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571644 4803 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571658 4803 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571671 4803 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571692 4803 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571707 4803 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571720 4803 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571732 4803 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571744 4803 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571756 4803 flags.go:64] FLAG: --kube-reserved=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.571768 4803 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572048 4803 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572079 4803 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572086 4803 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572096 4803 flags.go:64] FLAG: --lock-file=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572570 4803 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572606 4803 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572621 4803 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572652 4803 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572662 4803 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572674 4803 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572692 4803 flags.go:64] FLAG: --logging-format="text"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572704 4803 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572716 4803 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572726 4803 flags.go:64] FLAG: --manifest-url=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572743 4803 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572763 4803 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572774 4803 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572791 4803 flags.go:64] FLAG: --max-pods="110"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572803 4803 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572813 4803 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572823 4803 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572833 4803 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572845 4803 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572855 4803 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572869 4803 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572909 4803 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572920 4803 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572930 4803 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572941 4803 flags.go:64] FLAG: --pod-cidr=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572952 4803 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572972 4803 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572983 4803 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.572993 4803 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573003 4803 flags.go:64] FLAG: --port="10250"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573013 4803 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573024 4803 flags.go:64] FLAG: --provider-id=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573034 4803 flags.go:64] FLAG: --qos-reserved=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573044 4803 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573054 4803 flags.go:64] FLAG: --register-node="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573064 4803 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573074 4803 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573095 4803 flags.go:64] FLAG: --registry-burst="10"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573105 4803 flags.go:64] FLAG: --registry-qps="5"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573114 4803 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573123 4803 flags.go:64] FLAG: --reserved-memory=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573136 4803 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573147 4803 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573156 4803 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573166 4803 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573177 4803 flags.go:64] FLAG: --runonce="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573186 4803 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573196 4803 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573206 4803 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573216 4803 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573225 4803 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573235 4803 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573246 4803 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573255 4803 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573265 4803 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573275 4803 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573286 4803 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573295 4803 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573306 4803 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573324 4803 flags.go:64] FLAG: --system-cgroups=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573333 4803 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573351 4803 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573361 4803 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573370 4803 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573390 4803 flags.go:64] FLAG: --tls-min-version=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573399 4803 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573474 4803 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573486 4803 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573496 4803 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573506 4803 flags.go:64] FLAG: --v="2"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573552 4803 flags.go:64] FLAG: --version="false"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573570 4803 flags.go:64] FLAG: --vmodule=""
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573587 4803 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.573603 4803 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.573986 4803 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574001 4803 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574012 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574023 4803 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574031 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574039 4803 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574047 4803 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574058 4803 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574068 4803 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574076 4803 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574122 4803 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574130 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574204 4803 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574213 4803 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574223 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574232 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574239 4803 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574255 4803 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574266 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574277 4803 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574286 4803 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574294 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574302 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574310 4803 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574318 4803 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574326 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574334 4803 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574341 4803 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574350 4803 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574358 4803 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574367 4803 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574379 4803 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574387 4803 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574395 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574403 4803 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574412 4803 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574420 4803 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574427 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574435 4803 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574443 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574451 4803 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574458 4803 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574466 4803 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574474 4803 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574484 4803 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574495 4803 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574503 4803 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574512 4803 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574546 4803 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574557 4803 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574565 4803 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574573 4803 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574581 4803 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574588 4803 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574596 4803 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574604 4803 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574612 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574619 4803 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574627 4803 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574639 4803 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574650 4803 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574661 4803 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574671 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574683 4803 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574691 4803 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574700 4803 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574709 4803 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574717 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574725 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574733 4803 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.574745 4803 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.574782 4803 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.589473 4803 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.589542 4803 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589666 4803 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589683 4803 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589693 4803 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589702 4803 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589710 4803 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589721 4803 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589734 4803 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589743 4803 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589754 4803 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589764 4803 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589774 4803 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589782 4803 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589793 4803 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589802 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589811 4803 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589819 4803 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589827 4803 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589836 4803 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589844 4803 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589853 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 17:16:30 crc
kubenswrapper[4803]: W0320 17:16:30.589862 4803 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589870 4803 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589880 4803 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589888 4803 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589896 4803 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589905 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589913 4803 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589922 4803 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589931 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589939 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589947 4803 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589955 4803 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589963 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589971 4803 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589981 4803 
feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589989 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.589997 4803 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590005 4803 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590050 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590058 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590066 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590074 4803 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590083 4803 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590091 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590099 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590106 4803 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590114 4803 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590121 4803 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590129 4803 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 17:16:30 
crc kubenswrapper[4803]: W0320 17:16:30.590137 4803 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590145 4803 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590153 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590161 4803 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590169 4803 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590177 4803 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590184 4803 feature_gate.go:330] unrecognized feature gate: Example Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590191 4803 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590199 4803 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590209 4803 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590220 4803 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590229 4803 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590238 4803 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590246 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590271 4803 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590279 4803 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590287 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590294 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590302 4803 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590310 4803 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590318 4803 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590329 4803 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.590342 4803 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false 
ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590656 4803 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590673 4803 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590684 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590693 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590702 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590709 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590717 4803 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590725 4803 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590732 4803 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590740 4803 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590748 4803 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590755 4803 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590764 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 
17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590772 4803 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590780 4803 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590787 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590795 4803 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590803 4803 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590810 4803 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590818 4803 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590826 4803 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590833 4803 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590841 4803 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590849 4803 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590859 4803 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590870 4803 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590878 4803 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590885 4803 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590894 4803 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590910 4803 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590918 4803 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590926 4803 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590933 4803 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590941 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590950 4803 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590959 4803 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590967 4803 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590975 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590983 4803 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590991 4803 feature_gate.go:330] 
unrecognized feature gate: NetworkSegmentation Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.590998 4803 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591006 4803 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591013 4803 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591021 4803 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591031 4803 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591041 4803 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591050 4803 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591058 4803 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591066 4803 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591077 4803 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591086 4803 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591095 4803 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591103 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591114 4803 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591124 4803 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591133 4803 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591141 4803 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591150 4803 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591158 4803 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591166 4803 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591174 4803 feature_gate.go:330] unrecognized feature gate: Example Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591182 4803 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591189 4803 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591197 4803 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 17:16:30 crc 
kubenswrapper[4803]: W0320 17:16:30.591205 4803 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591213 4803 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591220 4803 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591229 4803 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591237 4803 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591245 4803 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.591253 4803 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.591264 4803 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.592618 4803 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.602457 4803 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.610699 4803 bootstrap.go:101] "Use the 
bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.610881 4803 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.613100 4803 server.go:997] "Starting client certificate rotation" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.613157 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.613415 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.641495 4803 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.645994 4803 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.647538 4803 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.667364 4803 log.go:25] "Validated CRI v1 runtime API" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.708113 4803 log.go:25] "Validated CRI v1 image API" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.711247 4803 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.718092 
4803 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-17-11-43-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.718168 4803 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:43 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.754277 4803 manager.go:217] Machine: {Timestamp:2026-03-20 17:16:30.750294295 +0000 UTC m=+0.661886435 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:f5fad69f-2b85-49f3-8d02-78bb8556dc89 BootID:ab993cfe-4b68-4c86-8f13-4224b3fe4fdc Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:43 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 
Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:8d:b6:fa Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:8d:b6:fa Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c1:da:82 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:c5:78:1a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:65:32:63 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1c:57:4b Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:03:4f:89:df:45 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:be:f6:cc:dc:4a:4f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 
Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.754616 4803 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available. Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.754785 4803 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.756422 4803 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.756706 4803 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.756765 4803 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Q
uantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.757017 4803 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.757031 4803 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.757425 4803 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.757771 4803 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.758058 4803 state_mem.go:36] "Initialized new in-memory state store" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.758153 4803 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.764661 4803 kubelet.go:418] "Attempting to sync node with API server" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.764704 4803 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.764733 4803 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.764756 4803 kubelet.go:324] "Adding apiserver pod source" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.764778 4803 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.769906 4803 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.770912 4803 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.771548 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.771707 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.771654 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.771805 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.773404 4803 kubelet.go:854] "Not starting ClusterTrustBundle 
informer because we are in static kubelet mode" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775371 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775401 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775411 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775469 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775495 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775513 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775543 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775571 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775580 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775590 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775618 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775628 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.775652 4803 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.776312 4803 server.go:1280] "Started 
kubelet" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.776335 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.777494 4803 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.777680 4803 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 17:16:30 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.779701 4803 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.781908 4803 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.781949 4803 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.782068 4803 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.782115 4803 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.782150 4803 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.784750 4803 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.785610 4803 server.go:460] "Adding debug handlers to kubelet server" Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.787906 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.787981 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.787428 4803 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e9c261f2c8b47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,LastTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.789235 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.789390 4803 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial 
unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.789411 4803 factory.go:55] Registering systemd factory Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.789428 4803 factory.go:221] Registration of the systemd container factory successfully Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.792682 4803 factory.go:153] Registering CRI-O factory Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.792720 4803 factory.go:221] Registration of the crio container factory successfully Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.792750 4803 factory.go:103] Registering Raw factory Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.792775 4803 manager.go:1196] Started watching for new ooms in manager Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.793893 4803 manager.go:319] Starting recovery of all containers Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.799640 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.799757 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.799791 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.799820 4803 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.799847 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.799876 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800319 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800356 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800389 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800419 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800584 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800614 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800650 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800684 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800714 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800740 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800802 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800832 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800859 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800886 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800913 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800945 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.800973 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801002 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801041 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801071 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801105 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801171 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801233 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801266 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801292 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801318 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801349 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801380 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801407 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801434 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801462 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801488 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801517 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801587 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" 
seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801615 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801645 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801672 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801701 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801730 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801759 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 17:16:30 crc 
kubenswrapper[4803]: I0320 17:16:30.801788 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801817 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801845 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801872 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.801899 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802387 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802443 4803 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802474 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802506 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802702 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802742 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802772 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802798 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802827 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802854 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802883 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802911 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802939 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802966 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" 
seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.802996 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803023 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803051 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803078 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803105 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803134 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803161 4803 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803190 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803218 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803248 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803275 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803303 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803333 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803359 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803386 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803413 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803442 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803467 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803495 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" 
volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803591 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803624 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803652 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803680 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803711 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803738 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803767 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803805 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803832 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803861 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803888 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803916 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" 
seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803943 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803973 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.803999 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804029 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804055 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804096 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 
17:16:30.804123 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804149 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804188 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804260 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804294 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804327 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804357 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804391 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804423 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804464 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804492 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804561 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804594 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804627 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804655 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804680 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804707 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804735 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.804763 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.805615 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.805685 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.805701 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.805717 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.805733 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.805750 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" 
seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.806914 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.806940 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.806957 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.806971 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.806986 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.806999 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807014 
4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807033 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807047 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807061 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807075 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807088 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807103 4803 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807117 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807135 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807149 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807164 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807179 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807193 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807206 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807219 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807233 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807247 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807266 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807281 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807297 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807312 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807326 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807340 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807354 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807369 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807382 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807483 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807502 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807517 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807553 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807570 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807583 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807596 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807609 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807625 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807638 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807658 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" 
seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807671 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807684 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807697 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807713 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807726 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807740 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 
17:16:30.807753 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807767 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807780 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807795 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807807 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807822 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807838 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807852 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807867 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807881 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807893 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807906 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807923 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807938 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807951 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807967 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807979 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.807992 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808007 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808022 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808036 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808053 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808070 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808089 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808105 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808124 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808142 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.808164 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.810611 4803 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.810651 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.810674 4803 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.810689 4803 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.810701 4803 reconstruct.go:97] "Volume reconstruction finished" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.810711 4803 reconciler.go:26] "Reconciler: start to sync state" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.826229 4803 manager.go:324] Recovery completed Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.843970 4803 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.846698 4803 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.846777 4803 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.846818 4803 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.846918 4803 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 17:16:30 crc kubenswrapper[4803]: W0320 17:16:30.849056 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.849145 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.853878 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.855955 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.856004 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.856018 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.857780 4803 cpu_manager.go:225] 
"Starting CPU manager" policy="none" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.857819 4803 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.857863 4803 state_mem.go:36] "Initialized new in-memory state store" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.881249 4803 policy_none.go:49] "None policy: Start" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.882358 4803 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.882393 4803 state_mem.go:35] "Initializing new in-memory state store" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.882399 4803 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.947136 4803 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.947420 4803 manager.go:334] "Starting Device Plugin manager" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.949388 4803 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.949436 4803 server.go:79] "Starting device plugin registration server" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.950120 4803 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.950152 4803 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.950318 4803 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.950457 4803 plugin_manager.go:116] "The 
desired_state_of_world populator (plugin watcher) starts" Mar 20 17:16:30 crc kubenswrapper[4803]: I0320 17:16:30.950478 4803 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.960327 4803 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:16:30 crc kubenswrapper[4803]: E0320 17:16:30.990801 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.050415 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.052045 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.052163 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.052188 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.052244 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:16:31 crc kubenswrapper[4803]: E0320 17:16:31.053204 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.147887 4803 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.148152 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.150310 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.150394 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.150421 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.150788 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.152586 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.152646 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.152667 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.153760 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.153816 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.153866 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.154001 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.154103 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.155314 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.155359 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.155378 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.155616 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.155837 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.155886 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.155904 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc 
kubenswrapper[4803]: I0320 17:16:31.155958 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.156022 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.156029 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.156056 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.156213 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.159750 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.159800 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.159818 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.159773 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.159919 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.159944 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.160378 4803 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.160587 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.160647 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.161805 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.161854 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.161878 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.162293 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.162326 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.162343 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.162627 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.162677 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.163775 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.163817 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.163835 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215099 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215214 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215268 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:16:31 crc 
kubenswrapper[4803]: I0320 17:16:31.215321 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215380 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215437 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215489 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215595 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215683 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215764 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215821 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215908 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.215976 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.216053 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.253609 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.255401 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.255466 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.255495 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.255569 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:16:31 crc kubenswrapper[4803]: E0320 17:16:31.256356 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317368 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317451 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317507 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317586 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317628 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317707 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317692 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317640 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317818 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317887 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317937 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317946 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318003 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.317832 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318017 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318069 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318086 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:16:31 crc 
kubenswrapper[4803]: I0320 17:16:31.318121 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318148 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318151 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318227 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318186 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318263 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318283 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318294 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318325 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318328 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318384 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318377 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.318351 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: E0320 17:16:31.392289 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.483360 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.499488 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.529132 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: W0320 17:16:31.537321 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-300d68d4f5a9710352fc789b02e9720ab49cc5713e90ea4eb422e094716915b4 WatchSource:0}: Error finding container 300d68d4f5a9710352fc789b02e9720ab49cc5713e90ea4eb422e094716915b4: Status 404 returned error can't find the container with id 300d68d4f5a9710352fc789b02e9720ab49cc5713e90ea4eb422e094716915b4 Mar 20 17:16:31 crc kubenswrapper[4803]: W0320 17:16:31.543809 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3e8ca98b6ae8a22247359d527dbd317a4258f9559a4b4b2768c196395737f339 WatchSource:0}: Error finding container 3e8ca98b6ae8a22247359d527dbd317a4258f9559a4b4b2768c196395737f339: Status 404 returned error can't find the container with id 3e8ca98b6ae8a22247359d527dbd317a4258f9559a4b4b2768c196395737f339 Mar 20 17:16:31 crc kubenswrapper[4803]: W0320 17:16:31.558281 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-884dda1100420c32c72081d5065428eb7205021bc4246eb1c02bc6eeb7b1e063 WatchSource:0}: Error finding container 884dda1100420c32c72081d5065428eb7205021bc4246eb1c02bc6eeb7b1e063: Status 404 returned error can't find the container with id 884dda1100420c32c72081d5065428eb7205021bc4246eb1c02bc6eeb7b1e063 Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.564197 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.577648 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:16:31 crc kubenswrapper[4803]: W0320 17:16:31.592650 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7c86177c4c2b17f8b4f3bbd7f9260435c4a432abf57d085651589e213d0d5a6d WatchSource:0}: Error finding container 7c86177c4c2b17f8b4f3bbd7f9260435c4a432abf57d085651589e213d0d5a6d: Status 404 returned error can't find the container with id 7c86177c4c2b17f8b4f3bbd7f9260435c4a432abf57d085651589e213d0d5a6d Mar 20 17:16:31 crc kubenswrapper[4803]: W0320 17:16:31.603161 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a7f8ae1a9078ac98563e2f21fd4a253804f5173434cc58e672b64d8115c8040f WatchSource:0}: Error finding container a7f8ae1a9078ac98563e2f21fd4a253804f5173434cc58e672b64d8115c8040f: Status 404 returned error can't find the container with id a7f8ae1a9078ac98563e2f21fd4a253804f5173434cc58e672b64d8115c8040f Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.657143 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.658973 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.659041 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.659069 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.659117 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:16:31 crc 
kubenswrapper[4803]: E0320 17:16:31.659957 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 17:16:31 crc kubenswrapper[4803]: W0320 17:16:31.743639 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:31 crc kubenswrapper[4803]: E0320 17:16:31.743768 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.777411 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.856540 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"884dda1100420c32c72081d5065428eb7205021bc4246eb1c02bc6eeb7b1e063"} Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.859174 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"300d68d4f5a9710352fc789b02e9720ab49cc5713e90ea4eb422e094716915b4"} Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.860484 4803 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3e8ca98b6ae8a22247359d527dbd317a4258f9559a4b4b2768c196395737f339"} Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.863162 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a7f8ae1a9078ac98563e2f21fd4a253804f5173434cc58e672b64d8115c8040f"} Mar 20 17:16:31 crc kubenswrapper[4803]: I0320 17:16:31.864694 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c86177c4c2b17f8b4f3bbd7f9260435c4a432abf57d085651589e213d0d5a6d"} Mar 20 17:16:32 crc kubenswrapper[4803]: E0320 17:16:32.193696 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Mar 20 17:16:32 crc kubenswrapper[4803]: W0320 17:16:32.199975 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:32 crc kubenswrapper[4803]: E0320 17:16:32.200064 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:32 crc kubenswrapper[4803]: W0320 17:16:32.309418 4803 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:32 crc kubenswrapper[4803]: E0320 17:16:32.309587 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:32 crc kubenswrapper[4803]: W0320 17:16:32.368753 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:32 crc kubenswrapper[4803]: E0320 17:16:32.368877 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.460946 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.463471 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.463565 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.463623 4803 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.463668 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:16:32 crc kubenswrapper[4803]: E0320 17:16:32.464315 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.697667 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:16:32 crc kubenswrapper[4803]: E0320 17:16:32.699011 4803 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.777880 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.874915 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"48bd8656a2abeeebf86c34ca3699602e3a1e51785724b7e3831beb49132a284e"} Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.875065 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6137f4411a10c6454e1635ec8d76f38c06b337851b01fcd3ac19033e5e399276"} Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.875098 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93126eed679cf9381d281ba5f85387fecfe1fa5e889e7302ba7b6378fe1c074a"} Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.878564 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908" exitCode=0 Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.878707 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908"} Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.878770 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.880170 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.880221 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.880245 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.881189 4803 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="dd1e67f4d72204efcc83555233db25a99e521837a4460037fa6d7cdcb4445ff4" exitCode=0 Mar 20 17:16:32 crc 
kubenswrapper[4803]: I0320 17:16:32.881240 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"dd1e67f4d72204efcc83555233db25a99e521837a4460037fa6d7cdcb4445ff4"} Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.881384 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.882776 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.883389 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.883452 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.883480 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.884233 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.884335 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.884357 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.886317 4803 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82" exitCode=0 Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.886442 4803 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.886423 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82"} Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.888076 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.888131 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.888153 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.889843 4803 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="1e382d02d53bab38428e39f479a361fb27b30fba16d710ef8baad9441a073886" exitCode=0 Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.889919 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"1e382d02d53bab38428e39f479a361fb27b30fba16d710ef8baad9441a073886"} Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.889945 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.891100 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:32 crc kubenswrapper[4803]: I0320 17:16:32.891159 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:32 
crc kubenswrapper[4803]: I0320 17:16:32.891184 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.777502 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 17:16:33 crc kubenswrapper[4803]: E0320 17:16:33.795361 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.895121 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77"} Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.895229 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a"} Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.895246 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1"} Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.898875 4803 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ecf5293e910fc120679e36cb974cd048c3f81801062b6908dcf666e0a720b6df" exitCode=0 Mar 20 
17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.899040 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ecf5293e910fc120679e36cb974cd048c3f81801062b6908dcf666e0a720b6df"}
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.899904 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.901730 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.901762 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.901775 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.904808 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209"}
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.904870 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.906334 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.906387 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.906412 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.908985 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c06b56ac85a0a31c00b7c31353292cd29b416b38f0869de50ca36133974a925c"}
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.909036 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1a41722077d6cd7c30a9317d7ea554d879bd34b37877019d810a96f28db2dbc7"}
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.909052 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cd52d19d2b6a6e0f36cba9244884e0166fe7b49a411c618ddabbc99c26ce9732"}
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.909089 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.910597 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.910697 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.910723 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.915112 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7cfe71cd0b41fbf5aaf308a2e464d15ac10f0164665a19b2a9209971d78245c6"}
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.915215 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.916288 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.916342 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:33 crc kubenswrapper[4803]: I0320 17:16:33.916361 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.065244 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.068027 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.068067 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.068085 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.068115 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 17:16:34 crc kubenswrapper[4803]: E0320 17:16:34.068578 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc"
Mar 20 17:16:34 crc kubenswrapper[4803]: W0320 17:16:34.227855 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 17:16:34 crc kubenswrapper[4803]: E0320 17:16:34.227964 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 17:16:34 crc kubenswrapper[4803]: W0320 17:16:34.358854 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 17:16:34 crc kubenswrapper[4803]: E0320 17:16:34.359004 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.924452 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc"}
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.924559 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed63ab71099d5d4faefcb62c40b8eb4d348b96b096d6132e1f4bc411313fdc83"}
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.924611 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.925890 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.925932 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.925950 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.928803 4803 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0cb486f3db02849d5e820d4dd4be82dc43a683b3401b1cd4c7f96a6d23a62665" exitCode=0
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.929119 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.929148 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0cb486f3db02849d5e820d4dd4be82dc43a683b3401b1cd4c7f96a6d23a62665"}
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.929193 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.929206 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.929293 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.929565 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.930869 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.930938 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.930967 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.931625 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.931678 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.931696 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.931825 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.931854 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.931871 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.932843 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.932912 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:34 crc kubenswrapper[4803]: I0320 17:16:34.932935 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.937761 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1a3d569a077eabcdaa43c3ad29a9b1e322fdc1d096cbc7263457ab874de5369a"}
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.937832 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f8b96c23695ad5e07ef7337e9883a5dae68a26c5f49c7d23aa94cc97d557c974"}
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.937874 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"72d1d9b73fd8838a863527d9f58b10880a4285dde4a2f60e20ff34ff641913ac"}
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.937837 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.937959 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.939241 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.939294 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:35 crc kubenswrapper[4803]: I0320 17:16:35.939311 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.893008 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.950488 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3551a327197fff72a30c766f036ff7e1f2c8faca6ea1ad6f6e2232ef00fccd4f"}
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.950613 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ed1556b57eee98ff727c886d98ad260e3f99225dd411de79206633887dd016eb"}
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.950732 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.952184 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.952248 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.952268 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.978808 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.979069 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.979153 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.980982 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.981052 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:36 crc kubenswrapper[4803]: I0320 17:16:36.981072 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.268997 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.270597 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.270651 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.270670 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.270704 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.953384 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.956120 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.956176 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:37 crc kubenswrapper[4803]: I0320 17:16:37.956195 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.123389 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.123652 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.123741 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.125736 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.125785 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.125807 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.275238 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.613386 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.613662 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.615464 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.615568 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.615593 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.957194 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.958580 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.958646 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:38 crc kubenswrapper[4803]: I0320 17:16:38.958673 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.462842 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.463134 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.464719 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.464810 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.464838 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.625014 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.625311 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.627244 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.627310 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.627336 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.935351 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.935652 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.937548 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.937620 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:40 crc kubenswrapper[4803]: I0320 17:16:40.937640 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:40 crc kubenswrapper[4803]: E0320 17:16:40.960613 4803 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 17:16:41 crc kubenswrapper[4803]: I0320 17:16:41.015022 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 17:16:41 crc kubenswrapper[4803]: I0320 17:16:41.015269 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:41 crc kubenswrapper[4803]: I0320 17:16:41.016810 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:41 crc kubenswrapper[4803]: I0320 17:16:41.016893 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:41 crc kubenswrapper[4803]: I0320 17:16:41.016911 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:41 crc kubenswrapper[4803]: I0320 17:16:41.614347 4803 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 17:16:41 crc kubenswrapper[4803]: I0320 17:16:41.614461 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.099770 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.100002 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.101460 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.101552 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.101574 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.107773 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.842197 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.967771 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.968946 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.969007 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.969024 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:42 crc kubenswrapper[4803]: I0320 17:16:42.975546 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 17:16:43 crc kubenswrapper[4803]: I0320 17:16:43.970037 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:43 crc kubenswrapper[4803]: I0320 17:16:43.971286 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:43 crc kubenswrapper[4803]: I0320 17:16:43.971361 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:43 crc kubenswrapper[4803]: I0320 17:16:43.971383 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:44 crc kubenswrapper[4803]: W0320 17:16:44.741191 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.741307 4803 trace.go:236] Trace[1707511828]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 17:16:34.739) (total time: 10002ms):
Mar 20 17:16:44 crc kubenswrapper[4803]: Trace[1707511828]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (17:16:44.741)
Mar 20 17:16:44 crc kubenswrapper[4803]: Trace[1707511828]: [10.002118352s] [10.002118352s] END
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.741337 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.778318 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 20 17:16:44 crc kubenswrapper[4803]: W0320 17:16:44.962373 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.962475 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.964874 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.972113 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.972447 4803 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.972681 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.973489 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.973579 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.973602 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:44 crc kubenswrapper[4803]: W0320 17:16:44.974633 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.974721 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.974848 4803 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.974931 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.976051 4803 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e9c261f2c8b47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,LastTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.979727 4803 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 17:16:44 crc kubenswrapper[4803]: I0320 17:16:44.979761 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 17:16:44 crc kubenswrapper[4803]: W0320 17:16:44.985016 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z
Mar 20 17:16:44 crc kubenswrapper[4803]: E0320 17:16:44.985101 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.781974 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:45Z is after 2026-02-23T05:33:13Z
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.977993 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.981722 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ed63ab71099d5d4faefcb62c40b8eb4d348b96b096d6132e1f4bc411313fdc83" exitCode=255
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.981793 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ed63ab71099d5d4faefcb62c40b8eb4d348b96b096d6132e1f4bc411313fdc83"}
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.982117 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.983828 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.983898 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.983921 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:45 crc kubenswrapper[4803]: I0320 17:16:45.984958 4803 scope.go:117] "RemoveContainer" containerID="ed63ab71099d5d4faefcb62c40b8eb4d348b96b096d6132e1f4bc411313fdc83"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.782448 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:46Z is after 2026-02-23T05:33:13Z
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.988211 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.989472 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.992811 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a" exitCode=255
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.992890 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a"}
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.992978 4803 scope.go:117] "RemoveContainer" containerID="ed63ab71099d5d4faefcb62c40b8eb4d348b96b096d6132e1f4bc411313fdc83"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.993206 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.994635 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.994680 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.994699 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 17:16:46 crc kubenswrapper[4803]: I0320 17:16:46.995502 4803 scope.go:117] "RemoveContainer" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a"
Mar 20 17:16:46 crc kubenswrapper[4803]: E0320 17:16:46.995855 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 17:16:47 crc kubenswrapper[4803]: I0320 17:16:47.782800 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate
has expired or is not yet valid: current time 2026-03-20T17:16:47Z is after 2026-02-23T05:33:13Z Mar 20 17:16:47 crc kubenswrapper[4803]: I0320 17:16:47.998241 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.132969 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.133257 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.134818 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.134879 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.134901 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.135829 4803 scope.go:117] "RemoveContainer" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a" Mar 20 17:16:48 crc kubenswrapper[4803]: E0320 17:16:48.136122 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.140316 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:48 crc kubenswrapper[4803]: W0320 17:16:48.496419 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:48Z is after 2026-02-23T05:33:13Z Mar 20 17:16:48 crc kubenswrapper[4803]: E0320 17:16:48.496571 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:16:48 crc kubenswrapper[4803]: I0320 17:16:48.782089 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:48Z is after 2026-02-23T05:33:13Z Mar 20 17:16:49 crc kubenswrapper[4803]: I0320 17:16:49.004992 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:49 crc kubenswrapper[4803]: I0320 17:16:49.006666 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:49 crc kubenswrapper[4803]: I0320 17:16:49.006744 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:49 crc kubenswrapper[4803]: I0320 17:16:49.006774 4803 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:49 crc kubenswrapper[4803]: I0320 17:16:49.007777 4803 scope.go:117] "RemoveContainer" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a" Mar 20 17:16:49 crc kubenswrapper[4803]: E0320 17:16:49.008079 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:16:49 crc kubenswrapper[4803]: W0320 17:16:49.095515 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:49Z is after 2026-02-23T05:33:13Z Mar 20 17:16:49 crc kubenswrapper[4803]: E0320 17:16:49.095688 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:16:49 crc kubenswrapper[4803]: I0320 17:16:49.782276 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T17:16:49Z is after 2026-02-23T05:33:13Z Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.765887 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.766161 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.767975 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.768052 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.768071 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.769071 4803 scope.go:117] "RemoveContainer" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a" Mar 20 17:16:50 crc kubenswrapper[4803]: E0320 17:16:50.769363 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.781885 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:50Z is after 2026-02-23T05:33:13Z Mar 20 17:16:50 crc 
kubenswrapper[4803]: E0320 17:16:50.960883 4803 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.981447 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.981796 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.983400 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.983482 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:50 crc kubenswrapper[4803]: I0320 17:16:50.983511 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.001256 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.010670 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.012002 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.012054 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.012077 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.015700 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.015927 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.017353 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.017407 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.017425 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.018273 4803 scope.go:117] "RemoveContainer" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a" Mar 20 17:16:51 crc kubenswrapper[4803]: E0320 17:16:51.018573 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:16:51 crc kubenswrapper[4803]: E0320 17:16:51.372332 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:51Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.373360 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:51 crc 
kubenswrapper[4803]: I0320 17:16:51.375224 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.375297 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.375318 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.375361 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:16:51 crc kubenswrapper[4803]: E0320 17:16:51.381843 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:51Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.614703 4803 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.614853 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:16:51 crc kubenswrapper[4803]: I0320 17:16:51.782980 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:51Z is after 2026-02-23T05:33:13Z Mar 20 17:16:52 crc kubenswrapper[4803]: I0320 17:16:52.781866 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:52Z is after 2026-02-23T05:33:13Z Mar 20 17:16:53 crc kubenswrapper[4803]: I0320 17:16:53.475726 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:16:53 crc kubenswrapper[4803]: E0320 17:16:53.481423 4803 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:16:53 crc kubenswrapper[4803]: I0320 17:16:53.781965 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:53Z is after 2026-02-23T05:33:13Z Mar 20 17:16:54 crc kubenswrapper[4803]: I0320 17:16:54.782606 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:54Z is after 2026-02-23T05:33:13Z Mar 20 17:16:54 crc kubenswrapper[4803]: E0320 17:16:54.982903 4803 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:54Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e9c261f2c8b47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,LastTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:16:55 crc kubenswrapper[4803]: W0320 17:16:55.157451 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:55Z is after 2026-02-23T05:33:13Z Mar 20 17:16:55 crc kubenswrapper[4803]: E0320 17:16:55.157577 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:55Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 20 17:16:55 crc kubenswrapper[4803]: I0320 17:16:55.781178 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:55Z is after 2026-02-23T05:33:13Z Mar 20 17:16:56 crc kubenswrapper[4803]: I0320 17:16:56.781618 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:56Z is after 2026-02-23T05:33:13Z Mar 20 17:16:57 crc kubenswrapper[4803]: W0320 17:16:57.212176 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:57Z is after 2026-02-23T05:33:13Z Mar 20 17:16:57 crc kubenswrapper[4803]: E0320 17:16:57.212307 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:57Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:16:57 crc kubenswrapper[4803]: I0320 17:16:57.781727 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:57Z is after 2026-02-23T05:33:13Z Mar 20 17:16:58 crc kubenswrapper[4803]: E0320 17:16:58.378211 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:58Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 17:16:58 crc kubenswrapper[4803]: I0320 17:16:58.382325 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:16:58 crc kubenswrapper[4803]: I0320 17:16:58.383697 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:16:58 crc kubenswrapper[4803]: I0320 17:16:58.383764 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:16:58 crc kubenswrapper[4803]: I0320 17:16:58.383783 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:16:58 crc kubenswrapper[4803]: I0320 17:16:58.383826 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:16:58 crc kubenswrapper[4803]: E0320 17:16:58.388520 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:58Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 17:16:58 crc kubenswrapper[4803]: I0320 17:16:58.782475 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:58Z is after 2026-02-23T05:33:13Z Mar 20 17:16:59 crc kubenswrapper[4803]: W0320 17:16:59.186634 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:59Z is after 2026-02-23T05:33:13Z Mar 20 17:16:59 crc kubenswrapper[4803]: E0320 17:16:59.186738 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:16:59 crc kubenswrapper[4803]: W0320 17:16:59.630754 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:59Z is after 2026-02-23T05:33:13Z Mar 20 17:16:59 crc kubenswrapper[4803]: E0320 17:16:59.630879 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 17:16:59 
crc kubenswrapper[4803]: I0320 17:16:59.782821 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:16:59Z is after 2026-02-23T05:33:13Z Mar 20 17:17:00 crc kubenswrapper[4803]: I0320 17:17:00.782371 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:00Z is after 2026-02-23T05:33:13Z Mar 20 17:17:00 crc kubenswrapper[4803]: E0320 17:17:00.961310 4803 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.615176 4803 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.615312 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.615425 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.615670 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.617093 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.617161 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.617182 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.617923 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"6137f4411a10c6454e1635ec8d76f38c06b337851b01fcd3ac19033e5e399276"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.618166 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://6137f4411a10c6454e1635ec8d76f38c06b337851b01fcd3ac19033e5e399276" gracePeriod=30 Mar 20 17:17:01 crc kubenswrapper[4803]: I0320 17:17:01.782285 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:01Z is after 2026-02-23T05:33:13Z Mar 20 17:17:02 crc 
kubenswrapper[4803]: I0320 17:17:02.047226 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.048243 4803 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="6137f4411a10c6454e1635ec8d76f38c06b337851b01fcd3ac19033e5e399276" exitCode=255 Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.048304 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"6137f4411a10c6454e1635ec8d76f38c06b337851b01fcd3ac19033e5e399276"} Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.048420 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"de36bfd2ca58e6ea80b545b19592d0cdcd98ec523c30d2b67c0666bf797175a7"} Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.048609 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.049980 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.050067 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.050092 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.782495 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:02Z is after 2026-02-23T05:33:13Z Mar 20 17:17:02 crc kubenswrapper[4803]: I0320 17:17:02.842675 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:03 crc kubenswrapper[4803]: I0320 17:17:03.052205 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:03 crc kubenswrapper[4803]: I0320 17:17:03.053782 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:03 crc kubenswrapper[4803]: I0320 17:17:03.053852 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:03 crc kubenswrapper[4803]: I0320 17:17:03.053878 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:03 crc kubenswrapper[4803]: I0320 17:17:03.779649 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:17:03Z is after 2026-02-23T05:33:13Z Mar 20 17:17:04 crc kubenswrapper[4803]: I0320 17:17:04.786912 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:04 crc kubenswrapper[4803]: E0320 17:17:04.991737 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c261f2c8b47 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,LastTimestamp:2026-03-20 17:16:30.776257351 +0000 UTC m=+0.687849431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:04 crc kubenswrapper[4803]: E0320 17:17:04.998278 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.006305 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.013253 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623edb4e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC m=+0.767617398,LastTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC m=+0.767617398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.020375 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c262a677242 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.964666946 +0000 UTC m=+0.876259056,LastTimestamp:2026-03-20 17:16:30.964666946 +0000 UTC m=+0.876259056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.029610 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed134b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:31.052098853 +0000 UTC m=+0.963690953,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.036844 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed8304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 
17:16:31.052179994 +0000 UTC m=+0.963772094,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.050143 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623edb4e6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623edb4e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC m=+0.767617398,LastTimestamp:2026-03-20 17:16:31.052198315 +0000 UTC m=+0.963790425,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.059091 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed134b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:31.150362289 +0000 UTC m=+1.061954399,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.066209 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed8304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 17:16:31.15041155 +0000 UTC m=+1.062003660,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.073448 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623edb4e6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623edb4e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC m=+0.767617398,LastTimestamp:2026-03-20 17:16:31.15043396 +0000 UTC m=+1.062026060,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.080007 4803 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed134b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:31.152617594 +0000 UTC m=+1.064209694,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.086759 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed8304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 17:16:31.152660485 +0000 UTC m=+1.064252595,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.093453 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623edb4e6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623edb4e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC m=+0.767617398,LastTimestamp:2026-03-20 17:16:31.152676275 +0000 UTC m=+1.064268375,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.100276 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed134b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:31.155348067 +0000 UTC m=+1.066940167,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.104994 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed8304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 17:16:31.155371737 +0000 UTC m=+1.066963847,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.110908 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623edb4e6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623edb4e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC m=+0.767617398,LastTimestamp:2026-03-20 17:16:31.155388708 +0000 UTC m=+1.066980808,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.117167 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed134b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:31.155872765 +0000 UTC m=+1.067464865,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.123072 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed8304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 17:16:31.155897996 +0000 UTC m=+1.067490096,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.130406 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623edb4e6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623edb4e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC 
m=+0.767617398,LastTimestamp:2026-03-20 17:16:31.155914906 +0000 UTC m=+1.067507016,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.134655 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed134b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:31.156045638 +0000 UTC m=+1.067637748,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.141771 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed8304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 17:16:31.15617796 +0000 UTC m=+1.067770100,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.149359 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623edb4e6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623edb4e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856025318 +0000 UTC m=+0.767617398,LastTimestamp:2026-03-20 17:16:31.156241091 +0000 UTC m=+1.067833191,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.156652 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed134b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed134b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.855983947 +0000 UTC m=+0.767576027,LastTimestamp:2026-03-20 17:16:31.159788546 +0000 UTC m=+1.071380656,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.163704 4803 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e9c2623ed8304\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e9c2623ed8304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:30.856012548 +0000 UTC m=+0.767604628,LastTimestamp:2026-03-20 17:16:31.159811907 +0000 UTC m=+1.071404007,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.172422 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c264d4064ae openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:31.549310126 +0000 UTC m=+1.460902226,LastTimestamp:2026-03-20 17:16:31.549310126 +0000 UTC m=+1.460902226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.179947 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c264d419fc3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:31.549390787 +0000 UTC m=+1.460982897,LastTimestamp:2026-03-20 17:16:31.549390787 +0000 UTC m=+1.460982897,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.186496 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c264e152426 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" 
already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:31.563252774 +0000 UTC m=+1.474844884,LastTimestamp:2026-03-20 17:16:31.563252774 +0000 UTC m=+1.474844884,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.193707 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26501ca7dc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:31.597299676 +0000 UTC m=+1.508891786,LastTimestamp:2026-03-20 17:16:31.597299676 +0000 UTC m=+1.508891786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.201094 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c26512d38f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:31.615162615 +0000 UTC m=+1.526754685,LastTimestamp:2026-03-20 17:16:31.615162615 +0000 UTC m=+1.526754685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.207443 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c2671662b20 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.155765536 +0000 UTC m=+2.067357636,LastTimestamp:2026-03-20 17:16:32.155765536 +0000 UTC m=+2.067357636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.210859 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2671a3342d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.159765549 +0000 UTC m=+2.071357629,LastTimestamp:2026-03-20 17:16:32.159765549 +0000 UTC m=+2.071357629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.214090 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c2671b753e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.161084389 +0000 UTC m=+2.072676459,LastTimestamp:2026-03-20 17:16:32.161084389 +0000 UTC m=+2.072676459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.218644 4803 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2671ba5e76 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.161283702 +0000 UTC m=+2.072875792,LastTimestamp:2026-03-20 17:16:32.161283702 +0000 UTC m=+2.072875792,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.221266 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c2671d1a490 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.162808976 +0000 UTC m=+2.074401046,LastTimestamp:2026-03-20 17:16:32.162808976 +0000 UTC m=+2.074401046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.224875 4803 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c2671f992fd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.165425917 +0000 UTC m=+2.077018017,LastTimestamp:2026-03-20 17:16:32.165425917 +0000 UTC m=+2.077018017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.228664 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c26729fbdf9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.176315897 +0000 UTC m=+2.087907967,LastTimestamp:2026-03-20 17:16:32.176315897 +0000 UTC m=+2.087907967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.234394 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c2672a175bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.176428479 +0000 UTC m=+2.088020579,LastTimestamp:2026-03-20 17:16:32.176428479 +0000 UTC m=+2.088020579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.240941 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2672b3a04a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.177619018 +0000 UTC m=+2.089211128,LastTimestamp:2026-03-20 17:16:32.177619018 +0000 UTC m=+2.089211128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.246960 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2672d4b4dc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.179786972 +0000 UTC m=+2.091379082,LastTimestamp:2026-03-20 17:16:32.179786972 +0000 UTC m=+2.091379082,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.253940 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c2672e0ca1c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.180578844 +0000 UTC m=+2.092170964,LastTimestamp:2026-03-20 17:16:32.180578844 +0000 UTC m=+2.092170964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.261285 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2686739149 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.508965193 +0000 UTC m=+2.420557303,LastTimestamp:2026-03-20 17:16:32.508965193 +0000 UTC m=+2.420557303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.268164 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c268796285a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.528009306 +0000 UTC m=+2.439601406,LastTimestamp:2026-03-20 17:16:32.528009306 +0000 UTC m=+2.439601406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.274406 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2687aff603 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.529700355 +0000 UTC m=+2.441292455,LastTimestamp:2026-03-20 17:16:32.529700355 +0000 UTC m=+2.441292455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.281504 4803 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2696dc243d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.784254013 +0000 UTC m=+2.695846113,LastTimestamp:2026-03-20 17:16:32.784254013 +0000 UTC m=+2.695846113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.287644 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2697c8e60c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.799770124 +0000 UTC m=+2.711362234,LastTimestamp:2026-03-20 17:16:32.799770124 +0000 UTC m=+2.711362234,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.294373 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2697e210bc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.801419452 +0000 UTC m=+2.713011552,LastTimestamp:2026-03-20 17:16:32.801419452 +0000 UTC m=+2.713011552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.301657 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c269cb51051 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.882356305 +0000 UTC m=+2.793948415,LastTimestamp:2026-03-20 17:16:32.882356305 +0000 UTC m=+2.793948415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.309142 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c269ce40194 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.885432724 +0000 UTC m=+2.797024834,LastTimestamp:2026-03-20 17:16:32.885432724 +0000 UTC m=+2.797024834,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.319068 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c269d831aed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.895859437 +0000 UTC m=+2.807451547,LastTimestamp:2026-03-20 17:16:32.895859437 +0000 UTC m=+2.807451547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.326671 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c269d83aed2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.895897298 +0000 UTC m=+2.807489408,LastTimestamp:2026-03-20 17:16:32.895897298 +0000 UTC m=+2.807489408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 
17:17:05.333454 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c26a6c9927e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.05147251 +0000 UTC m=+2.963064590,LastTimestamp:2026-03-20 17:16:33.05147251 +0000 UTC m=+2.963064590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.340760 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c26a89ac628 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.081959976 +0000 UTC m=+2.993552046,LastTimestamp:2026-03-20 
17:16:33.081959976 +0000 UTC m=+2.993552046,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.348773 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c26aa4d15ef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.110423023 +0000 UTC m=+3.022015103,LastTimestamp:2026-03-20 17:16:33.110423023 +0000 UTC m=+3.022015103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.355870 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26aa5c4c26 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.111419942 +0000 UTC m=+3.023012002,LastTimestamp:2026-03-20 
17:16:33.111419942 +0000 UTC m=+3.023012002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.363064 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26ab8e3f49 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.131470665 +0000 UTC m=+3.043062745,LastTimestamp:2026-03-20 17:16:33.131470665 +0000 UTC m=+3.043062745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.369016 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26abe6b343 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.137267523 
+0000 UTC m=+3.048859603,LastTimestamp:2026-03-20 17:16:33.137267523 +0000 UTC m=+3.048859603,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.375117 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c26abed0ebb openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.137684155 +0000 UTC m=+3.049276235,LastTimestamp:2026-03-20 17:16:33.137684155 +0000 UTC m=+3.049276235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.381597 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26ac0359da openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.139145178 +0000 UTC m=+3.050737268,LastTimestamp:2026-03-20 17:16:33.139145178 +0000 UTC m=+3.050737268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.382012 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.388644 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.388743 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c26ac2bf7a7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.141807015 +0000 UTC m=+3.053399095,LastTimestamp:2026-03-20 17:16:33.141807015 +0000 UTC m=+3.053399095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc 
kubenswrapper[4803]: I0320 17:17:05.390600 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.390691 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.390710 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.390745 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.395196 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.395914 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26ac8fde43 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.148354115 +0000 UTC m=+3.059946195,LastTimestamp:2026-03-20 17:16:33.148354115 +0000 UTC m=+3.059946195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 
17:17:05.399689 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26accfcfeb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.152544747 +0000 UTC m=+3.064136827,LastTimestamp:2026-03-20 17:16:33.152544747 +0000 UTC m=+3.064136827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.403637 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e9c26ad37e525 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.159365925 +0000 UTC 
m=+3.070957985,LastTimestamp:2026-03-20 17:16:33.159365925 +0000 UTC m=+3.070957985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.409893 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26b6a20523 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.317315875 +0000 UTC m=+3.228907935,LastTimestamp:2026-03-20 17:16:33.317315875 +0000 UTC m=+3.228907935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.416420 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26b799ef01 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.333563137 +0000 UTC m=+3.245155237,LastTimestamp:2026-03-20 17:16:33.333563137 +0000 UTC m=+3.245155237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.423272 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26b7b16190 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.335099792 +0000 UTC m=+3.246691862,LastTimestamp:2026-03-20 17:16:33.335099792 +0000 UTC m=+3.246691862,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.430018 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26b98bc70b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.366189835 +0000 UTC m=+3.277781945,LastTimestamp:2026-03-20 17:16:33.366189835 +0000 UTC m=+3.277781945,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.436753 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26ba889ec9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.382760137 +0000 UTC m=+3.294352247,LastTimestamp:2026-03-20 17:16:33.382760137 +0000 UTC m=+3.294352247,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.444063 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26ba9a0126 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.38389943 +0000 UTC m=+3.295491540,LastTimestamp:2026-03-20 17:16:33.38389943 +0000 UTC m=+3.295491540,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.450799 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26c8f54f8c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.6247643 +0000 UTC m=+3.536356410,LastTimestamp:2026-03-20 17:16:33.6247643 +0000 UTC m=+3.536356410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc 
kubenswrapper[4803]: E0320 17:17:05.457890 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26ca1ff2b7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.644335799 +0000 UTC m=+3.555927889,LastTimestamp:2026-03-20 17:16:33.644335799 +0000 UTC m=+3.555927889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.465338 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26cadacefd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.656581885 +0000 UTC m=+3.568173985,LastTimestamp:2026-03-20 17:16:33.656581885 +0000 UTC 
m=+3.568173985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.473187 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26caed591b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.657796891 +0000 UTC m=+3.569388981,LastTimestamp:2026-03-20 17:16:33.657796891 +0000 UTC m=+3.569388981,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.481480 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e9c26cbc54160 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.671946592 +0000 UTC m=+3.583538672,LastTimestamp:2026-03-20 17:16:33.671946592 +0000 UTC m=+3.583538672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.488576 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26d8de25ac openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.891681708 +0000 UTC m=+3.803273778,LastTimestamp:2026-03-20 17:16:33.891681708 +0000 UTC m=+3.803273778,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.496704 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e9c26d994bb08 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.903647496 +0000 UTC m=+3.815239566,LastTimestamp:2026-03-20 17:16:33.903647496 +0000 UTC m=+3.815239566,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.504339 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26da596f88 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.91653876 +0000 UTC m=+3.828130830,LastTimestamp:2026-03-20 17:16:33.91653876 +0000 UTC m=+3.828130830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.510435 4803 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26da6df24f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.917882959 +0000 UTC m=+3.829475069,LastTimestamp:2026-03-20 17:16:33.917882959 +0000 UTC m=+3.829475069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.516199 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c26e7988859 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:34.138777689 +0000 UTC m=+4.050369759,LastTimestamp:2026-03-20 17:16:34.138777689 +0000 UTC m=+4.050369759,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.522452 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26e7c93c0e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:34.141969422 +0000 UTC m=+4.053561492,LastTimestamp:2026-03-20 17:16:34.141969422 +0000 UTC m=+4.053561492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.528871 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c26e86c704b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:34.152665163 +0000 UTC m=+4.064257223,LastTimestamp:2026-03-20 17:16:34.152665163 +0000 UTC 
m=+4.064257223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.534628 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26e90d64f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:34.16321356 +0000 UTC m=+4.074805630,LastTimestamp:2026-03-20 17:16:34.16321356 +0000 UTC m=+4.074805630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.543471 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2717035016 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:34.93430479 +0000 UTC m=+4.845896870,LastTimestamp:2026-03-20 17:16:34.93430479 +0000 UTC m=+4.845896870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.548781 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2726579bfd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.191487485 +0000 UTC m=+5.103079595,LastTimestamp:2026-03-20 17:16:35.191487485 +0000 UTC m=+5.103079595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.553653 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c27270e0b92 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.203443602 
+0000 UTC m=+5.115035702,LastTimestamp:2026-03-20 17:16:35.203443602 +0000 UTC m=+5.115035702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.559916 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2727288d97 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.205180823 +0000 UTC m=+5.116772923,LastTimestamp:2026-03-20 17:16:35.205180823 +0000 UTC m=+5.116772923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.568571 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c273853897e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container 
etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.493210494 +0000 UTC m=+5.404802594,LastTimestamp:2026-03-20 17:16:35.493210494 +0000 UTC m=+5.404802594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.574782 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c273943c855 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.508955221 +0000 UTC m=+5.420547322,LastTimestamp:2026-03-20 17:16:35.508955221 +0000 UTC m=+5.420547322,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.580480 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c273954f8b2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.510081714 +0000 UTC m=+5.421673814,LastTimestamp:2026-03-20 17:16:35.510081714 +0000 UTC m=+5.421673814,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.586572 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2747d4d3a8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.753341864 +0000 UTC m=+5.664933944,LastTimestamp:2026-03-20 17:16:35.753341864 +0000 UTC m=+5.664933944,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.594460 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2748933f03 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.765821187 +0000 UTC m=+5.677413297,LastTimestamp:2026-03-20 17:16:35.765821187 +0000 UTC m=+5.677413297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.600320 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2748ab9b49 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:35.767417673 +0000 UTC m=+5.679009783,LastTimestamp:2026-03-20 17:16:35.767417673 +0000 UTC m=+5.679009783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.607584 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2758d3ec5a 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:36.038495322 +0000 UTC m=+5.950087432,LastTimestamp:2026-03-20 17:16:36.038495322 +0000 UTC m=+5.950087432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.614183 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2759bc6d11 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:36.053732625 +0000 UTC m=+5.965324725,LastTimestamp:2026-03-20 17:16:36.053732625 +0000 UTC m=+5.965324725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.620244 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c2759d59ebf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:36.055383743 +0000 UTC m=+5.966975853,LastTimestamp:2026-03-20 17:16:36.055383743 +0000 UTC m=+5.966975853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.622262 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c276952695a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:36.315220314 +0000 UTC m=+6.226812424,LastTimestamp:2026-03-20 17:16:36.315220314 +0000 UTC m=+6.226812424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.624901 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e9c276a50dbb1 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:36.331895729 +0000 UTC m=+6.243487809,LastTimestamp:2026-03-20 17:16:36.331895729 +0000 UTC m=+6.243487809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.629130 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:17:05 crc kubenswrapper[4803]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c28a52de7c9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 17:17:05 crc kubenswrapper[4803]: body: Mar 20 17:17:05 crc kubenswrapper[4803]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:41.614428105 +0000 UTC m=+11.526020205,LastTimestamp:2026-03-20 17:16:41.614428105 +0000 UTC m=+11.526020205,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:17:05 crc kubenswrapper[4803]: > Mar 20 17:17:05 crc 
kubenswrapper[4803]: E0320 17:17:05.634019 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c28a52f1d9a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:41.614507418 +0000 UTC m=+11.526099528,LastTimestamp:2026-03-20 17:16:41.614507418 +0000 UTC m=+11.526099528,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.641901 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 17:17:05 crc kubenswrapper[4803]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c296d7aaad3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:17:05 crc kubenswrapper[4803]: body: 
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:17:05 crc kubenswrapper[4803]: Mar 20 17:17:05 crc kubenswrapper[4803]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:44.974901971 +0000 UTC m=+14.886494051,LastTimestamp:2026-03-20 17:16:44.974901971 +0000 UTC m=+14.886494051,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:17:05 crc kubenswrapper[4803]: > Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.647554 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c296d7b937d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:44.974961533 +0000 UTC m=+14.886553613,LastTimestamp:2026-03-20 17:16:44.974961533 +0000 UTC m=+14.886553613,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.652992 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c296d7aaad3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event=< Mar 20 17:17:05 crc kubenswrapper[4803]: &Event{ObjectMeta:{kube-apiserver-crc.189e9c296d7aaad3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 17:17:05 crc kubenswrapper[4803]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 17:17:05 crc kubenswrapper[4803]: Mar 20 17:17:05 crc kubenswrapper[4803]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:44.974901971 +0000 UTC m=+14.886494051,LastTimestamp:2026-03-20 17:16:44.979750532 +0000 UTC m=+14.891342612,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:17:05 crc kubenswrapper[4803]: > Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.658460 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c296d7b937d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c296d7b937d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 
403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:44.974961533 +0000 UTC m=+14.886553613,LastTimestamp:2026-03-20 17:16:44.979779823 +0000 UTC m=+14.891371903,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.663818 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c26da6df24f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26da6df24f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:33.917882959 +0000 UTC m=+3.829475069,LastTimestamp:2026-03-20 17:16:45.987550162 +0000 UTC m=+15.899142272,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.668233 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c26e7c93c0e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26e7c93c0e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] 
[] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:34.141969422 +0000 UTC m=+4.053561492,LastTimestamp:2026-03-20 17:16:46.178176613 +0000 UTC m=+16.089768703,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.673565 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e9c26e90d64f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e9c26e90d64f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:34.16321356 +0000 UTC m=+4.074805630,LastTimestamp:2026-03-20 17:16:46.187940386 +0000 UTC m=+16.099532456,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.682369 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c28a52de7c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:17:05 crc kubenswrapper[4803]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c28a52de7c9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 17:17:05 crc kubenswrapper[4803]: body: Mar 20 17:17:05 crc kubenswrapper[4803]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:41.614428105 +0000 UTC m=+11.526020205,LastTimestamp:2026-03-20 17:16:51.614801567 +0000 UTC m=+21.526393677,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:17:05 crc kubenswrapper[4803]: > Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.688503 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c28a52f1d9a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c28a52f1d9a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:41.614507418 +0000 UTC m=+11.526099528,LastTimestamp:2026-03-20 17:16:51.61490805 +0000 UTC m=+21.526500150,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.694598 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:17:05 crc kubenswrapper[4803]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c2d4d525efc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:17:05 crc kubenswrapper[4803]: body: Mar 20 17:17:05 crc kubenswrapper[4803]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:01.615259388 +0000 UTC m=+31.526851498,LastTimestamp:2026-03-20 17:17:01.615259388 +0000 UTC m=+31.526851498,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:17:05 crc kubenswrapper[4803]: > Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.699780 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2d4d543487 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:01.615379591 +0000 UTC m=+31.526971711,LastTimestamp:2026-03-20 17:17:01.615379591 +0000 UTC m=+31.526971711,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.706291 4803 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2d4d7e561c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:01.6181407 +0000 UTC m=+31.529732810,LastTimestamp:2026-03-20 17:17:01.6181407 +0000 UTC m=+31.529732810,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.712301 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c2672b3a04a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2672b3a04a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.177619018 +0000 UTC m=+2.089211128,LastTimestamp:2026-03-20 17:17:01.743134176 +0000 UTC m=+31.654726286,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.717899 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c2686739149\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2686739149 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.508965193 +0000 UTC m=+2.420557303,LastTimestamp:2026-03-20 17:17:02.000470427 +0000 UTC m=+31.912062517,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: E0320 17:17:05.725634 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c268796285a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c268796285a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:16:32.528009306 +0000 UTC m=+2.439601406,LastTimestamp:2026-03-20 17:17:02.014077287 +0000 UTC m=+31.925669397,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.779003 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" 
cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.848090 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.850006 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.850058 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.850071 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:05 crc kubenswrapper[4803]: I0320 17:17:05.850889 4803 scope.go:117] "RemoveContainer" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a" Mar 20 17:17:06 crc kubenswrapper[4803]: I0320 17:17:06.784781 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.067643 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.068275 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.071637 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d" exitCode=255 Mar 20 17:17:07 
crc kubenswrapper[4803]: I0320 17:17:07.071710 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d"} Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.071770 4803 scope.go:117] "RemoveContainer" containerID="116f3e42f190f4c4a57408e680f812fda3cf33765379d6dfa768d70e4cbdea4a" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.071944 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.073308 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.073360 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.073379 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.074280 4803 scope.go:117] "RemoveContainer" containerID="02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d" Mar 20 17:17:07 crc kubenswrapper[4803]: E0320 17:17:07.074616 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:17:07 crc kubenswrapper[4803]: I0320 17:17:07.781968 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:08 crc kubenswrapper[4803]: I0320 17:17:08.077469 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:17:08 crc kubenswrapper[4803]: W0320 17:17:08.489662 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 17:17:08 crc kubenswrapper[4803]: E0320 17:17:08.490071 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:17:08 crc kubenswrapper[4803]: I0320 17:17:08.614116 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:08 crc kubenswrapper[4803]: I0320 17:17:08.614388 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:08 crc kubenswrapper[4803]: I0320 17:17:08.615859 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:08 crc kubenswrapper[4803]: I0320 17:17:08.615907 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:08 crc kubenswrapper[4803]: I0320 17:17:08.615927 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:08 crc kubenswrapper[4803]: I0320 17:17:08.783672 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:09 crc kubenswrapper[4803]: I0320 17:17:09.785384 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.677563 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.698442 4803 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.765861 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.766048 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.767234 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.767278 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.767312 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.767848 4803 scope.go:117] "RemoveContainer" containerID="02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d" Mar 20 17:17:10 crc kubenswrapper[4803]: E0320 17:17:10.768028 4803 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:17:10 crc kubenswrapper[4803]: I0320 17:17:10.784471 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:10 crc kubenswrapper[4803]: W0320 17:17:10.947164 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:10 crc kubenswrapper[4803]: E0320 17:17:10.947231 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:17:10 crc kubenswrapper[4803]: E0320 17:17:10.961883 4803 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.015916 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.088201 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.089614 4803 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.089698 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.089720 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.090910 4803 scope.go:117] "RemoveContainer" containerID="02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d" Mar 20 17:17:11 crc kubenswrapper[4803]: E0320 17:17:11.091278 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.614973 4803 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.615056 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:17:11 crc kubenswrapper[4803]: E0320 17:17:11.623185 4803 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c2d4d525efc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 17:17:11 crc kubenswrapper[4803]: &Event{ObjectMeta:{kube-controller-manager-crc.189e9c2d4d525efc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 17:17:11 crc kubenswrapper[4803]: body: Mar 20 17:17:11 crc kubenswrapper[4803]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:01.615259388 +0000 UTC m=+31.526851498,LastTimestamp:2026-03-20 17:17:11.615035861 +0000 UTC m=+41.526627961,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 17:17:11 crc kubenswrapper[4803]: > Mar 20 17:17:11 crc kubenswrapper[4803]: E0320 17:17:11.631369 4803 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e9c2d4d543487\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e9c2d4d543487 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:17:01.615379591 +0000 UTC m=+31.526971711,LastTimestamp:2026-03-20 17:17:11.615146774 +0000 UTC m=+41.526738874,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:17:11 crc kubenswrapper[4803]: I0320 17:17:11.784466 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:12 crc kubenswrapper[4803]: E0320 17:17:12.390844 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:17:12 crc kubenswrapper[4803]: I0320 17:17:12.395943 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:12 crc kubenswrapper[4803]: I0320 17:17:12.398689 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:12 crc kubenswrapper[4803]: I0320 17:17:12.398759 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:12 crc kubenswrapper[4803]: I0320 17:17:12.398784 4803 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:12 crc kubenswrapper[4803]: I0320 17:17:12.398868 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:12 crc kubenswrapper[4803]: E0320 17:17:12.406069 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:17:12 crc kubenswrapper[4803]: I0320 17:17:12.785405 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:13 crc kubenswrapper[4803]: I0320 17:17:13.784469 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:14 crc kubenswrapper[4803]: I0320 17:17:14.780958 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:14 crc kubenswrapper[4803]: W0320 17:17:14.925011 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 17:17:14 crc kubenswrapper[4803]: E0320 17:17:14.925069 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User 
\"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 17:17:15 crc kubenswrapper[4803]: I0320 17:17:15.783644 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:16 crc kubenswrapper[4803]: I0320 17:17:16.781448 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:17 crc kubenswrapper[4803]: I0320 17:17:17.781293 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:18 crc kubenswrapper[4803]: I0320 17:17:18.628215 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:18 crc kubenswrapper[4803]: I0320 17:17:18.629148 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:18 crc kubenswrapper[4803]: I0320 17:17:18.630545 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:18 crc kubenswrapper[4803]: I0320 17:17:18.630587 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:18 crc kubenswrapper[4803]: I0320 17:17:18.630598 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:18 crc kubenswrapper[4803]: I0320 17:17:18.633555 4803 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:17:18 crc kubenswrapper[4803]: I0320 17:17:18.781630 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.108394 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.109284 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.109400 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.109486 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:19 crc kubenswrapper[4803]: E0320 17:17:19.398559 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.406730 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.408133 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.408171 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:19 crc 
kubenswrapper[4803]: I0320 17:17:19.408186 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.408238 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:19 crc kubenswrapper[4803]: E0320 17:17:19.415144 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:17:19 crc kubenswrapper[4803]: I0320 17:17:19.781516 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:20 crc kubenswrapper[4803]: W0320 17:17:20.324359 4803 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 17:17:20 crc kubenswrapper[4803]: E0320 17:17:20.324433 4803 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 17:17:20 crc kubenswrapper[4803]: I0320 17:17:20.472967 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 17:17:20 crc kubenswrapper[4803]: I0320 17:17:20.473143 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:20 crc kubenswrapper[4803]: I0320 17:17:20.474448 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 17:17:20 crc kubenswrapper[4803]: I0320 17:17:20.474479 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:20 crc kubenswrapper[4803]: I0320 17:17:20.474492 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:20 crc kubenswrapper[4803]: I0320 17:17:20.783411 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:20 crc kubenswrapper[4803]: E0320 17:17:20.962048 4803 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:17:21 crc kubenswrapper[4803]: I0320 17:17:21.780252 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:22 crc kubenswrapper[4803]: I0320 17:17:22.781157 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:23 crc kubenswrapper[4803]: I0320 17:17:23.785238 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:24 crc kubenswrapper[4803]: I0320 17:17:24.783757 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:25 crc kubenswrapper[4803]: I0320 17:17:25.781991 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:25 crc kubenswrapper[4803]: I0320 17:17:25.847217 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:25 crc kubenswrapper[4803]: I0320 17:17:25.848380 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:25 crc kubenswrapper[4803]: I0320 17:17:25.848438 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:25 crc kubenswrapper[4803]: I0320 17:17:25.848457 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:25 crc kubenswrapper[4803]: I0320 17:17:25.849302 4803 scope.go:117] "RemoveContainer" containerID="02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d" Mar 20 17:17:25 crc kubenswrapper[4803]: E0320 17:17:25.849623 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 17:17:26 crc kubenswrapper[4803]: E0320 17:17:26.407866 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group 
\"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:17:26 crc kubenswrapper[4803]: I0320 17:17:26.416128 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:26 crc kubenswrapper[4803]: I0320 17:17:26.417340 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:26 crc kubenswrapper[4803]: I0320 17:17:26.417390 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:26 crc kubenswrapper[4803]: I0320 17:17:26.417402 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:26 crc kubenswrapper[4803]: I0320 17:17:26.417424 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:26 crc kubenswrapper[4803]: E0320 17:17:26.423160 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:17:26 crc kubenswrapper[4803]: I0320 17:17:26.784766 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:27 crc kubenswrapper[4803]: I0320 17:17:27.785507 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:28 crc kubenswrapper[4803]: I0320 17:17:28.784188 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:29 crc kubenswrapper[4803]: I0320 17:17:29.784579 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:30 crc kubenswrapper[4803]: I0320 17:17:30.785045 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:30 crc kubenswrapper[4803]: E0320 17:17:30.962341 4803 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 17:17:31 crc kubenswrapper[4803]: I0320 17:17:31.783956 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:32 crc kubenswrapper[4803]: I0320 17:17:32.785267 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:33 crc kubenswrapper[4803]: E0320 17:17:33.412967 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 17:17:33 crc kubenswrapper[4803]: I0320 17:17:33.423245 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:33 crc 
kubenswrapper[4803]: I0320 17:17:33.424880 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:33 crc kubenswrapper[4803]: I0320 17:17:33.424927 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:33 crc kubenswrapper[4803]: I0320 17:17:33.424949 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:33 crc kubenswrapper[4803]: I0320 17:17:33.424990 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:33 crc kubenswrapper[4803]: E0320 17:17:33.433907 4803 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 17:17:33 crc kubenswrapper[4803]: I0320 17:17:33.783690 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:34 crc kubenswrapper[4803]: I0320 17:17:34.784138 4803 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 17:17:35 crc kubenswrapper[4803]: I0320 17:17:35.436398 4803 csr.go:261] certificate signing request csr-9wm6m is approved, waiting to be issued Mar 20 17:17:35 crc kubenswrapper[4803]: I0320 17:17:35.449013 4803 csr.go:257] certificate signing request csr-9wm6m is issued Mar 20 17:17:35 crc kubenswrapper[4803]: I0320 17:17:35.554007 4803 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 17:17:35 crc kubenswrapper[4803]: I0320 17:17:35.613849 4803 
transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 17:17:36 crc kubenswrapper[4803]: I0320 17:17:36.450178 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-24 13:11:44.767630765 +0000 UTC Mar 20 17:17:36 crc kubenswrapper[4803]: I0320 17:17:36.450235 4803 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6691h54m8.31740065s for next certificate rotation Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.736189 4803 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.805420 4803 apiserver.go:52] "Watching apiserver" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.812226 4803 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.812830 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zp898","openshift-multus/multus-additional-cni-plugins-56rll","openshift-network-diagnostics/network-check-target-xd92c","openshift-ovn-kubernetes/ovnkube-node-4v5dx","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-image-registry/node-ca-l5swf","openshift-machine-config-operator/machine-config-daemon-26nll","openshift-multus/multus-d8jn6","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.813419 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.813505 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.813556 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.813565 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.813669 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.813887 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:37 crc kubenswrapper[4803]: E0320 17:17:37.814007 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.814644 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.814747 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:37 crc kubenswrapper[4803]: E0320 17:17:37.814992 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.815147 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.815541 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.815573 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:37 crc kubenswrapper[4803]: E0320 17:17:37.815669 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.815937 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-d8jn6" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.820334 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.820588 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.820727 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.820754 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.820774 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.820823 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.820984 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.821087 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.821136 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.821377 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.822469 4803 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.822555 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.822603 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.822665 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.822760 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823087 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823173 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823176 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823212 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823255 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823273 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823201 4803 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823382 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823483 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823744 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823752 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.823860 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.824103 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.824250 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.824300 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.824241 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.824351 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.824414 4803 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.824473 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.826788 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.838889 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.866182 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.890333 4803 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.891047 4803 scope.go:117] "RemoveContainer" containerID="02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.897725 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:17:37 crc 
kubenswrapper[4803]: I0320 17:17:37.909063 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.923464 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.936285 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.947790 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.961893 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971662 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971704 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971729 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971754 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971779 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971801 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971821 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971843 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971864 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971894 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971915 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971936 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.971936 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972027 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972149 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972205 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972228 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972252 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972459 4803 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972493 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972534 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972557 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972553 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972583 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972715 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973247 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973432 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974097 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974137 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974163 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974191 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974214 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974234 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974250 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974265 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974282 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974298 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974326 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:17:37 crc 
kubenswrapper[4803]: I0320 17:17:37.974342 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974359 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974376 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974393 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974409 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974426 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974444 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974460 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974478 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974493 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974509 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974542 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974563 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974578 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974593 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974608 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974623 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974636 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974651 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974666 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974684 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974700 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 
17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974717 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974732 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972959 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.972980 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973001 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973249 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973296 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973361 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973649 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973721 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.973971 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974666 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974707 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: E0320 17:17:37.974756 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:17:38.474738132 +0000 UTC m=+68.386330202 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976204 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976239 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974899 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.974977 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.975191 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976075 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976275 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976368 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976397 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976405 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976424 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976439 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976450 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976480 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976508 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976551 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976576 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976600 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976623 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976648 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976661 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976704 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976671 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976818 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976825 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976844 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976983 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.977097 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.977676 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.977884 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.978202 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.978269 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.978628 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.978757 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.978843 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.979061 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.979373 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.979501 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.979710 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.979898 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980037 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.976823 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980307 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980282 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.979539 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980881 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980911 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980931 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980950 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980972 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.980993 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981009 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981027 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981047 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981064 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981080 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 
17:17:37.981097 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981113 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981128 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981144 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981161 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981177 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981197 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981213 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981231 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981247 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981262 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981288 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981305 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981321 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981339 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981355 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981371 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981387 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981404 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981411 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981421 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981440 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981457 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981473 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981489 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981507 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981549 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981567 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981582 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981596 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981611 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " 
Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981635 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981650 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981667 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981683 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981700 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981700 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981716 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981736 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981752 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981768 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981784 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981801 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981817 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981837 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981854 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981871 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 
17:17:37.981886 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981905 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981918 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981922 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.981991 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982014 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982035 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982058 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982106 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982133 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982155 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982173 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982192 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982209 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982228 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982249 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982268 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982284 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982288 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982329 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982353 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982375 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982393 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982410 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982493 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982555 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982577 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982707 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982728 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982745 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982763 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982781 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982798 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982815 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 
17:17:37.982833 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982862 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982871 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.983900 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.984247 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.982921 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.984328 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.984690 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.984740 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.984929 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.984972 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985087 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985123 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985297 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985439 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985488 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985514 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985584 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985648 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.985935 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.986469 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.986727 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.986754 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.986794 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.987401 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.987496 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.987617 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.988272 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.988246 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.988452 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.988493 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989041 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989136 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989191 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989245 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989390 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989568 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989608 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.989781 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.990140 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.990847 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.991461 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.991639 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.991967 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.990721 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992264 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992361 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992279 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992732 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992891 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992957 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992977 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.992920 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.993367 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.993588 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.993672 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.993861 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.993911 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.994269 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.994304 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.994335 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.994393 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.994655 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.994719 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.995010 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.995445 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.995473 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.995593 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.995993 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.996045 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.996428 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.996471 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.996633 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.996693 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.997065 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.997247 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.993055 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.997352 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.997391 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.997870 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.997952 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 17:17:37 crc 
kubenswrapper[4803]: I0320 17:17:37.997989 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998026 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998061 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998093 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998121 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998153 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998185 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998220 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998253 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998292 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998325 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998353 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998383 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998422 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.998727 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:37 crc kubenswrapper[4803]: I0320 17:17:37.986318 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.998937 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.998987 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999042 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.998475 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999101 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999133 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999151 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999166 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999369 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999423 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999454 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999479 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999503 4803 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999565 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999601 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999674 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:37.999944 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.000469 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.000574 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.000930 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.001278 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.001100 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.002516 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.002604 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.003145 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.001799 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.007990 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008193 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008234 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-etc-kubernetes\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008254 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008274 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-slash\") pod 
\"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008294 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008314 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-cni-bin\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008332 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-daemon-config\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008351 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjqbr\" (UniqueName: \"kubernetes.io/projected/8510a852-14e1-4aba-826c-de9d4cfac290-kube-api-access-cjqbr\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008371 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-node-log\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008387 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlw5\" (UniqueName: \"kubernetes.io/projected/ec2c9586-ac9f-467a-a353-e43ac2a99797-kube-api-access-qrlw5\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008404 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-socket-dir-parent\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008420 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-netns\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008440 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008458 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" 
(UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008477 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008501 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-var-lib-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008537 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-cni-multus\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008555 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-config\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008572 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-cni-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008590 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008608 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-system-cni-dir\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008626 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-etc-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-ovn\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 
17:17:38.008660 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4326b171-36ab-465f-ba67-a636b36f1f89-ovn-node-metrics-cert\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008677 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-kubelet\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008695 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008713 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008730 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cnibin\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 
17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008748 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtw8\" (UniqueName: \"kubernetes.io/projected/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-kube-api-access-mbtw8\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008782 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2c9586-ac9f-467a-a353-e43ac2a99797-host\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008802 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ec2c9586-ac9f-467a-a353-e43ac2a99797-serviceca\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008867 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008890 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008909 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cni-binary-copy\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008931 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-netd\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008947 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-os-release\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008966 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-hostroot\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008983 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8510a852-14e1-4aba-826c-de9d4cfac290-mcd-auth-proxy-config\") pod \"machine-config-daemon-26nll\" (UID: 
\"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009001 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p2gx\" (UniqueName: \"kubernetes.io/projected/7a26ad31-dca7-4b95-80a8-d8a3db949d1a-kube-api-access-6p2gx\") pod \"node-resolver-zp898\" (UID: \"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\") " pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009020 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-log-socket\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009039 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-ovn-kubernetes\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009059 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-script-lib\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009076 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/8510a852-14e1-4aba-826c-de9d4cfac290-rootfs\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009091 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-netns\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008991 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009107 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c909c3-a57a-4440-9052-48718b1d2dfd-cni-binary-copy\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009584 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-os-release\") pod 
\"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008309 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008581 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.008653 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009031 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009130 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009255 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009398 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.009739 4803 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.009800 4803 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.009811 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:38.509784627 +0000 UTC m=+68.421376787 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009870 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009909 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-bin\") pod 
\"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009943 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.009981 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010009 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-conf-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010036 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-multus-certs\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.010047 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2026-03-20 17:17:38.510037265 +0000 UTC m=+68.421629495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010075 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-tuning-conf-dir\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010101 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8510a852-14e1-4aba-826c-de9d4cfac290-proxy-tls\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010641 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010713 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010746 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-env-overrides\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010776 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vhj\" (UniqueName: \"kubernetes.io/projected/4326b171-36ab-465f-ba67-a636b36f1f89-kube-api-access-s7vhj\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010823 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-cnibin\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010850 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22l6k\" (UniqueName: \"kubernetes.io/projected/55c909c3-a57a-4440-9052-48718b1d2dfd-kube-api-access-22l6k\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010890 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-kubelet\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010915 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-systemd-units\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010952 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-systemd\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.010983 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011021 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-system-cni-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011047 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-k8s-cni-cncf-io\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011084 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011109 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a26ad31-dca7-4b95-80a8-d8a3db949d1a-hosts-file\") pod \"node-resolver-zp898\" (UID: \"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\") " pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011228 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011421 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011442 4803 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011456 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011452 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011574 4803 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.011622 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012078 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012102 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012285 4803 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012395 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012447 4803 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012550 4803 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012682 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012852 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012995 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013035 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013191 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013217 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013325 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013359 4803 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013400 4803 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013746 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013759 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013769 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013799 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013811 4803 reconciler_common.go:293] 
"Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013823 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013834 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013846 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013857 4803 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013866 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013876 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013886 4803 reconciler_common.go:293] "Volume detached for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013896 4803 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013906 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013915 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014165 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014180 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014189 4803 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014198 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014208 4803 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014217 4803 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014227 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014236 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014245 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014254 4803 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014263 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014273 4803 reconciler_common.go:293] "Volume 
detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014283 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014293 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014302 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014311 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014320 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014330 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014341 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" 
(UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014350 4803 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014360 4803 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014370 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014379 4803 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014389 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014398 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014407 4803 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014416 4803 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014427 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014436 4803 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014444 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014455 4803 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014465 4803 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014474 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014484 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014492 4803 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014500 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014514 4803 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014543 4803 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014554 4803 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014580 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014591 4803 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014961 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014976 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014987 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014996 4803 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015005 4803 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015014 4803 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath 
\"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015025 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015034 4803 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015042 4803 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015054 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015064 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015075 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015087 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015096 4803 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015106 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015116 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015125 4803 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015135 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015147 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015158 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015169 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015177 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015186 4803 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015196 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015205 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015215 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015224 4803 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015235 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015244 4803 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015254 4803 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015264 4803 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015274 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015283 4803 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015292 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015300 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc 
kubenswrapper[4803]: I0320 17:17:38.015309 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015317 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015327 4803 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015335 4803 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015344 4803 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015353 4803 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015362 4803 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015371 4803 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015379 4803 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015388 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015396 4803 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015406 4803 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015415 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015425 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015436 4803 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015446 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015455 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015464 4803 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015473 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015482 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015491 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015499 4803 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") 
on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015509 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015533 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015548 4803 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015559 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015570 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015581 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015591 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015599 4803 
reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015609 4803 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015620 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015630 4803 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015641 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015652 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015663 4803 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015674 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012404 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012714 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012804 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.012820 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.013101 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.014727 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015027 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015793 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.015940 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.016394 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.017210 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.017252 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.017500 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.017656 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.017880 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.018019 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.018423 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.018536 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.018687 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.018890 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.018961 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.019385 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.019501 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.019565 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.019984 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.020245 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.020558 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.020826 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.020924 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.021178 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.021271 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.021416 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.021537 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.021657 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.022743 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.023350 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.023370 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.023389 4803 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.023439 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:38.523422417 +0000 UTC m=+68.435014487 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.027232 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.027919 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.028134 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.028659 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.029482 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.029677 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.030187 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.030419 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.030452 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.030469 4803 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.030590 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:38.530563909 +0000 UTC m=+68.442156039 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.032172 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.032303 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.033180 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.033217 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" 
(UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.034918 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.036879 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.037481 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.038967 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.042780 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.046846 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.051565 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.052148 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.062830 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.069482 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116447 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-conf-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116495 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-multus-certs\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116578 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-tuning-conf-dir\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116592 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-conf-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116614 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8510a852-14e1-4aba-826c-de9d4cfac290-proxy-tls\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc 
kubenswrapper[4803]: I0320 17:17:38.116616 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-multus-certs\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116638 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116685 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-env-overrides\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116738 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vhj\" (UniqueName: \"kubernetes.io/projected/4326b171-36ab-465f-ba67-a636b36f1f89-kube-api-access-s7vhj\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116763 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-cnibin\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116790 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-22l6k\" (UniqueName: \"kubernetes.io/projected/55c909c3-a57a-4440-9052-48718b1d2dfd-kube-api-access-22l6k\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116815 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116814 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-kubelet\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.116843 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-kubelet\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117023 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-cnibin\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117106 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-systemd-units\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117140 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-systemd\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117163 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117189 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-system-cni-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117215 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-k8s-cni-cncf-io\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117224 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-systemd\") pod \"ovnkube-node-4v5dx\" (UID: 
\"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117237 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a26ad31-dca7-4b95-80a8-d8a3db949d1a-hosts-file\") pod \"node-resolver-zp898\" (UID: \"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\") " pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117266 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-etc-kubernetes\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117294 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-k8s-cni-cncf-io\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117285 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117318 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117294 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117352 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7a26ad31-dca7-4b95-80a8-d8a3db949d1a-hosts-file\") pod \"node-resolver-zp898\" (UID: \"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\") " pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117264 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-systemd-units\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117369 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-etc-kubernetes\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117344 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-system-cni-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117398 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-cni-bin\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117437 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-slash\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117473 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjqbr\" (UniqueName: \"kubernetes.io/projected/8510a852-14e1-4aba-826c-de9d4cfac290-kube-api-access-cjqbr\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117487 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-cni-bin\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117499 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-slash\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117508 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-daemon-config\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117567 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlw5\" (UniqueName: \"kubernetes.io/projected/ec2c9586-ac9f-467a-a353-e43ac2a99797-kube-api-access-qrlw5\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117596 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-node-log\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117631 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-netns\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117686 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-socket-dir-parent\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117721 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-var-lib-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117751 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-cni-multus\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117781 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-run-netns\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117783 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-config\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117811 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-node-log\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117817 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-cni-dir\") pod \"multus-d8jn6\" (UID: 
\"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117837 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-var-lib-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117850 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-system-cni-dir\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117875 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-socket-dir-parent\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117886 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-etc-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117921 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-ovn\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117953 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4326b171-36ab-465f-ba67-a636b36f1f89-ovn-node-metrics-cert\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.117987 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-kubelet\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118026 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtw8\" (UniqueName: \"kubernetes.io/projected/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-kube-api-access-mbtw8\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118059 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2c9586-ac9f-467a-a353-e43ac2a99797-host\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118042 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-env-overrides\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc 
kubenswrapper[4803]: I0320 17:17:38.118091 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ec2c9586-ac9f-467a-a353-e43ac2a99797-serviceca\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118141 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cnibin\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118175 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118199 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-cni-multus\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118214 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cni-binary-copy\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118252 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-netd\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118269 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-host-var-lib-kubelet\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118282 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-os-release\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118316 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-hostroot\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118334 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-ovn\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118348 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/8510a852-14e1-4aba-826c-de9d4cfac290-mcd-auth-proxy-config\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118368 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-cni-dir\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118384 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p2gx\" (UniqueName: \"kubernetes.io/projected/7a26ad31-dca7-4b95-80a8-d8a3db949d1a-kube-api-access-6p2gx\") pod \"node-resolver-zp898\" (UID: \"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\") " pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118619 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-system-cni-dir\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118623 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-log-socket\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118667 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-log-socket\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118690 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-ovn-kubernetes\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118717 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-script-lib\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118738 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8510a852-14e1-4aba-826c-de9d4cfac290-rootfs\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118761 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c909c3-a57a-4440-9052-48718b1d2dfd-cni-binary-copy\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118781 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-os-release\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118802 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-netns\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118824 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-bin\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118847 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118861 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ec2c9586-ac9f-467a-a353-e43ac2a99797-host\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118863 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-os-release\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118319 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-etc-openvswitch\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118283 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/55c909c3-a57a-4440-9052-48718b1d2dfd-multus-daemon-config\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118658 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-netd\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118932 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cnibin\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.118940 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-ovn-kubernetes\") pod \"ovnkube-node-4v5dx\" (UID: 
\"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.119026 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-config\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.119103 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.119127 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/8510a852-14e1-4aba-826c-de9d4cfac290-rootfs\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.119865 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cni-binary-copy\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120032 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8510a852-14e1-4aba-826c-de9d4cfac290-mcd-auth-proxy-config\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " 
pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120074 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-os-release\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120092 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-bin\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120288 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120325 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-netns\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120355 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/55c909c3-a57a-4440-9052-48718b1d2dfd-hostroot\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc 
kubenswrapper[4803]: I0320 17:17:38.120412 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120431 4803 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120445 4803 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120443 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/55c909c3-a57a-4440-9052-48718b1d2dfd-cni-binary-copy\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120470 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120498 4803 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120517 4803 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc 
kubenswrapper[4803]: I0320 17:17:38.120560 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120654 4803 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120674 4803 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120692 4803 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121590 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-tuning-conf-dir\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121684 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121749 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc 
kubenswrapper[4803]: I0320 17:17:38.121806 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121826 4803 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121843 4803 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121859 4803 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121875 4803 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121892 4803 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121911 4803 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121930 4803 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121949 4803 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121966 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.121984 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122003 4803 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122019 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122036 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122053 4803 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122070 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122092 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122112 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122130 4803 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122146 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122163 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122179 4803 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122198 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122216 4803 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122233 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122250 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122269 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122288 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122312 4803 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122327 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122345 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122362 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122380 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122396 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122412 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122429 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122445 4803 
reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.122461 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.120622 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-script-lib\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.123040 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ec2c9586-ac9f-467a-a353-e43ac2a99797-serviceca\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.123395 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8510a852-14e1-4aba-826c-de9d4cfac290-proxy-tls\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.123858 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4326b171-36ab-465f-ba67-a636b36f1f89-ovn-node-metrics-cert\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 
crc kubenswrapper[4803]: I0320 17:17:38.135266 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vhj\" (UniqueName: \"kubernetes.io/projected/4326b171-36ab-465f-ba67-a636b36f1f89-kube-api-access-s7vhj\") pod \"ovnkube-node-4v5dx\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.142381 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p2gx\" (UniqueName: \"kubernetes.io/projected/7a26ad31-dca7-4b95-80a8-d8a3db949d1a-kube-api-access-6p2gx\") pod \"node-resolver-zp898\" (UID: \"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\") " pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.143894 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlw5\" (UniqueName: \"kubernetes.io/projected/ec2c9586-ac9f-467a-a353-e43ac2a99797-kube-api-access-qrlw5\") pod \"node-ca-l5swf\" (UID: \"ec2c9586-ac9f-467a-a353-e43ac2a99797\") " pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.144115 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjqbr\" (UniqueName: \"kubernetes.io/projected/8510a852-14e1-4aba-826c-de9d4cfac290-kube-api-access-cjqbr\") pod \"machine-config-daemon-26nll\" (UID: \"8510a852-14e1-4aba-826c-de9d4cfac290\") " pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.145793 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.148235 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtw8\" (UniqueName: \"kubernetes.io/projected/dad4e80b-88f1-4e64-a7e6-136c1d3b6e67-kube-api-access-mbtw8\") pod \"multus-additional-cni-plugins-56rll\" (UID: \"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\") " pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.149358 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22l6k\" (UniqueName: \"kubernetes.io/projected/55c909c3-a57a-4440-9052-48718b1d2dfd-kube-api-access-22l6k\") pod \"multus-d8jn6\" (UID: \"55c909c3-a57a-4440-9052-48718b1d2dfd\") " pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.160973 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:38 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:38 crc kubenswrapper[4803]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:17:38 crc kubenswrapper[4803]: source /etc/kubernetes/apiserver-url.env Mar 20 17:17:38 crc kubenswrapper[4803]: else Mar 20 17:17:38 crc kubenswrapper[4803]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:17:38 crc kubenswrapper[4803]: exit 1 Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:17:38 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.162302 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.166550 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.167051 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8b2f3168bd199a44ea347edceae015c494c9f9434d73d6ccb5b29d92da11eb1d"} Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.168384 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:38 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:38 crc kubenswrapper[4803]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:17:38 crc kubenswrapper[4803]: source /etc/kubernetes/apiserver-url.env Mar 20 17:17:38 crc kubenswrapper[4803]: else Mar 20 17:17:38 crc kubenswrapper[4803]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:17:38 crc kubenswrapper[4803]: exit 1 Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:17:38 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.168720 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.169633 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.170787 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a"} Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.171117 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.179837 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: W0320 17:17:38.181351 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-13ffdc2331540299e1752f348aebd62b70b19c3df70360c0c1a4b50b8f4012c0 WatchSource:0}: Error finding container 13ffdc2331540299e1752f348aebd62b70b19c3df70360c0c1a4b50b8f4012c0: Status 404 returned error can't find the container with id 13ffdc2331540299e1752f348aebd62b70b19c3df70360c0c1a4b50b8f4012c0 Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.183456 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:38 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:38 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:38 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:38 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 17:17:38 crc kubenswrapper[4803]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:17:38 crc kubenswrapper[4803]: ho_enable="--enable-hybrid-overlay" Mar 20 17:17:38 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:17:38 crc kubenswrapper[4803]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:17:38 crc kubenswrapper[4803]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:17:38 crc kubenswrapper[4803]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:17:38 crc kubenswrapper[4803]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:17:38 crc kubenswrapper[4803]: --webhook-host=127.0.0.1 \ Mar 20 17:17:38 crc kubenswrapper[4803]: --webhook-port=9743 \ Mar 20 17:17:38 crc kubenswrapper[4803]: ${ho_enable} \ Mar 20 17:17:38 crc kubenswrapper[4803]: --enable-interconnect \ Mar 20 17:17:38 crc kubenswrapper[4803]: --disable-approver \ Mar 20 17:17:38 crc kubenswrapper[4803]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:17:38 crc kubenswrapper[4803]: --wait-for-kubernetes-api=200s \ Mar 20 17:17:38 crc kubenswrapper[4803]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:17:38 crc kubenswrapper[4803]: --loglevel="${LOGLEVEL}" Mar 20 17:17:38 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.185719 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zp898" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.188037 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.188276 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:38 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:38 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:38 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:38 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: 
Mar 20 17:17:38 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:17:38 crc kubenswrapper[4803]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:17:38 crc kubenswrapper[4803]: --disable-webhook \ Mar 20 17:17:38 crc kubenswrapper[4803]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:17:38 crc kubenswrapper[4803]: --loglevel="${LOGLEVEL}" Mar 20 17:17:38 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.189645 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:17:38 crc kubenswrapper[4803]: W0320 17:17:38.196351 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a26ad31_dca7_4b95_80a8_d8a3db949d1a.slice/crio-55955554a3ac1af6ea1488677f7c0e8573c14643ed7ccf5f90fa08a5bf454d27 WatchSource:0}: Error finding container 55955554a3ac1af6ea1488677f7c0e8573c14643ed7ccf5f90fa08a5bf454d27: Status 404 returned error can't find the container with id 55955554a3ac1af6ea1488677f7c0e8573c14643ed7ccf5f90fa08a5bf454d27 Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.197255 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.199211 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:38 crc kubenswrapper[4803]: set -uo pipefail Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 17:17:38 crc kubenswrapper[4803]: HOSTS_FILE="/etc/hosts" Mar 20 17:17:38 crc kubenswrapper[4803]: TEMP_FILE="/etc/hosts.tmp" Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 17:17:38 crc 
kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: # Make a temporary file with the old hosts file's attributes. Mar 20 17:17:38 crc kubenswrapper[4803]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 17:17:38 crc kubenswrapper[4803]: echo "Failed to preserve hosts file. Exiting." Mar 20 17:17:38 crc kubenswrapper[4803]: exit 1 Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: while true; do Mar 20 17:17:38 crc kubenswrapper[4803]: declare -A svc_ips Mar 20 17:17:38 crc kubenswrapper[4803]: for svc in "${services[@]}"; do Mar 20 17:17:38 crc kubenswrapper[4803]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 17:17:38 crc kubenswrapper[4803]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 17:17:38 crc kubenswrapper[4803]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 17:17:38 crc kubenswrapper[4803]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 17:17:38 crc kubenswrapper[4803]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:38 crc kubenswrapper[4803]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:38 crc kubenswrapper[4803]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:38 crc kubenswrapper[4803]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 17:17:38 crc kubenswrapper[4803]: for i in ${!cmds[*]} Mar 20 17:17:38 crc kubenswrapper[4803]: do Mar 20 17:17:38 crc kubenswrapper[4803]: ips=($(eval "${cmds[i]}")) Mar 20 17:17:38 crc kubenswrapper[4803]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 17:17:38 crc kubenswrapper[4803]: svc_ips["${svc}"]="${ips[@]}" Mar 20 17:17:38 crc kubenswrapper[4803]: break Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: # Update /etc/hosts only if we get valid service IPs Mar 20 17:17:38 crc kubenswrapper[4803]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 17:17:38 crc kubenswrapper[4803]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 17:17:38 crc kubenswrapper[4803]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 17:17:38 crc kubenswrapper[4803]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 17:17:38 crc kubenswrapper[4803]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 17:17:38 crc kubenswrapper[4803]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 17:17:38 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:38 crc kubenswrapper[4803]: continue Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: # Append resolver entries for services Mar 20 17:17:38 crc kubenswrapper[4803]: rc=0 Mar 20 17:17:38 crc kubenswrapper[4803]: for svc in "${!svc_ips[@]}"; do Mar 20 17:17:38 crc kubenswrapper[4803]: for ip in ${svc_ips[${svc}]}; do Mar 20 17:17:38 crc kubenswrapper[4803]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: if [[ $rc -ne 0 ]]; then Mar 20 17:17:38 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:38 crc kubenswrapper[4803]: continue Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: Mar 20 17:17:38 crc kubenswrapper[4803]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 17:17:38 crc kubenswrapper[4803]: # Replace /etc/hosts with our modified version if needed Mar 20 17:17:38 crc kubenswrapper[4803]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 17:17:38 crc kubenswrapper[4803]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:38 crc kubenswrapper[4803]: unset svc_ips Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6p2gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zp898_openshift-dns(7a26ad31-dca7-4b95-80a8-d8a3db949d1a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.201233 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zp898" podUID="7a26ad31-dca7-4b95-80a8-d8a3db949d1a" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.205266 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.214320 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d72987714
9908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.218800 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.227766 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: W0320 17:17:38.234238 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4326b171_36ab_465f_ba67_a636b36f1f89.slice/crio-7bf0ad411aece19fe6b688f92a0d4012aa4ba9929b264e80a04aca998b52d8f9 WatchSource:0}: Error finding container 7bf0ad411aece19fe6b688f92a0d4012aa4ba9929b264e80a04aca998b52d8f9: Status 404 returned error can't find the container with id 7bf0ad411aece19fe6b688f92a0d4012aa4ba9929b264e80a04aca998b52d8f9 Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.236632 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: init container 
&Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 17:17:38 crc kubenswrapper[4803]: apiVersion: v1 Mar 20 17:17:38 crc kubenswrapper[4803]: clusters: Mar 20 17:17:38 crc kubenswrapper[4803]: - cluster: Mar 20 17:17:38 crc kubenswrapper[4803]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 17:17:38 crc kubenswrapper[4803]: server: https://api-int.crc.testing:6443 Mar 20 17:17:38 crc kubenswrapper[4803]: name: default-cluster Mar 20 17:17:38 crc kubenswrapper[4803]: contexts: Mar 20 17:17:38 crc kubenswrapper[4803]: - context: Mar 20 17:17:38 crc kubenswrapper[4803]: cluster: default-cluster Mar 20 17:17:38 crc kubenswrapper[4803]: namespace: default Mar 20 17:17:38 crc kubenswrapper[4803]: user: default-auth Mar 20 17:17:38 crc kubenswrapper[4803]: name: default-context Mar 20 17:17:38 crc kubenswrapper[4803]: current-context: default-context Mar 20 17:17:38 crc kubenswrapper[4803]: kind: Config Mar 20 17:17:38 crc kubenswrapper[4803]: preferences: {} Mar 20 17:17:38 crc kubenswrapper[4803]: users: Mar 20 17:17:38 crc kubenswrapper[4803]: - name: default-auth Mar 20 17:17:38 crc kubenswrapper[4803]: user: Mar 20 17:17:38 crc kubenswrapper[4803]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:17:38 crc kubenswrapper[4803]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:17:38 crc kubenswrapper[4803]: EOF Mar 20 17:17:38 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s7vhj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-4v5dx_openshift-ovn-kubernetes(4326b171-36ab-465f-ba67-a636b36f1f89): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.237832 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.239005 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-56rll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.244901 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.249711 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.252672 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.258235 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5swf" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.261602 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.263612 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-d8jn6" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.264174 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbtw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-56rll_openshift-multus(dad4e80b-88f1-4e64-a7e6-136c1d3b6e67): CreateContainerConfigError: services have not yet been read at 
least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: W0320 17:17:38.264302 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8510a852_14e1_4aba_826c_de9d4cfac290.slice/crio-db333556be22b3709fa474d20989e7386ef016b5416588a0d7e752363f81fc55 WatchSource:0}: Error finding container db333556be22b3709fa474d20989e7386ef016b5416588a0d7e752363f81fc55: Status 404 returned error can't find the container with id db333556be22b3709fa474d20989e7386ef016b5416588a0d7e752363f81fc55 Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.265670 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-56rll" podUID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.272434 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: W0320 17:17:38.277976 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0925e936c6f38845dea711d6fa503e05920b9c6a8bed258552109e68ac05402d WatchSource:0}: Error finding container 0925e936c6f38845dea711d6fa503e05920b9c6a8bed258552109e68ac05402d: Status 404 returned error can't find the container with id 0925e936c6f38845dea711d6fa503e05920b9c6a8bed258552109e68ac05402d Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.289091 4803 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjqbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.290194 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.292817 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.294165 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.294441 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjqbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: W0320 17:17:38.294797 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2c9586_ac9f_467a_a353_e43ac2a99797.slice/crio-0bbfc26296ee880b3d42dd46dd3d506155387247fed748965906ae29d3f32101 WatchSource:0}: Error finding container 0bbfc26296ee880b3d42dd46dd3d506155387247fed748965906ae29d3f32101: Status 404 returned error can't find the container with id 0bbfc26296ee880b3d42dd46dd3d506155387247fed748965906ae29d3f32101 Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.296256 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.299354 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 17:17:38 crc kubenswrapper[4803]: while [ true ]; Mar 20 17:17:38 crc kubenswrapper[4803]: do Mar 20 17:17:38 crc kubenswrapper[4803]: for f in $(ls /tmp/serviceca); do Mar 20 17:17:38 crc kubenswrapper[4803]: echo $f Mar 20 17:17:38 crc 
kubenswrapper[4803]: ca_file_path="/tmp/serviceca/${f}" Mar 20 17:17:38 crc kubenswrapper[4803]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 17:17:38 crc kubenswrapper[4803]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 17:17:38 crc kubenswrapper[4803]: if [ -e "${reg_dir_path}" ]; then Mar 20 17:17:38 crc kubenswrapper[4803]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 17:17:38 crc kubenswrapper[4803]: else Mar 20 17:17:38 crc kubenswrapper[4803]: mkdir $reg_dir_path Mar 20 17:17:38 crc kubenswrapper[4803]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: for d in $(ls /etc/docker/certs.d); do Mar 20 17:17:38 crc kubenswrapper[4803]: echo $d Mar 20 17:17:38 crc kubenswrapper[4803]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 17:17:38 crc kubenswrapper[4803]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 17:17:38 crc kubenswrapper[4803]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 17:17:38 crc kubenswrapper[4803]: rm -rf /etc/docker/certs.d/$d Mar 20 17:17:38 crc kubenswrapper[4803]: fi Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: sleep 60 & wait ${!} Mar 20 17:17:38 crc kubenswrapper[4803]: done Mar 20 17:17:38 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrlw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-l5swf_openshift-image-registry(ec2c9586-ac9f-467a-a353-e43ac2a99797): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.300455 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-l5swf" podUID="ec2c9586-ac9f-467a-a353-e43ac2a99797" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.301702 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.302946 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:38 crc kubenswrapper[4803]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 17:17:38 crc kubenswrapper[4803]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 17:17:38 crc kubenswrapper[4803]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22l6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d8jn6_openshift-multus(55c909c3-a57a-4440-9052-48718b1d2dfd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:38 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.304501 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d8jn6" podUID="55c909c3-a57a-4440-9052-48718b1d2dfd" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.313176 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.323793 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.331832 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.345188 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.354784 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with 
unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.364890 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.390204 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.398681 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.411812 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.429672 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.442733 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.464268 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.483718 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.496564 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.510665 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.526931 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.527040 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527155 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:17:39.527093059 +0000 UTC m=+69.438685169 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527177 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527197 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527212 4803 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527276 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:39.527257524 +0000 UTC m=+69.438849674 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527488 4803 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527603 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:39.527586506 +0000 UTC m=+69.439178616 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.527343 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.527746 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.527937 4803 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.528057 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:39.528040441 +0000 UTC m=+69.439632541 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.628370 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.628599 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.628649 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.628665 4803 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: E0320 17:17:38.628702 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:17:39.628690582 +0000 UTC m=+69.540282662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.851950 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.852712 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.854116 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.854969 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.856637 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.857334 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.858056 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.858740 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.859521 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.861103 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.862383 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.864419 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.864969 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.865854 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.866339 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.867278 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.867963 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.868457 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.869644 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.870317 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.870854 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.872023 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.872555 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.873541 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.874052 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.875097 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.875765 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.876208 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.877176 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.877633 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.878400 4803 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.878495 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.880095 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.881062 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.881443 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.882890 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.884043 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.884743 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.885974 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.886764 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.887327 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.888295 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.889233 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.889896 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.890710 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.891260 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.892093 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.892856 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.893865 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.894339 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.894958 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.895856 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.896434 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 17:17:38 crc kubenswrapper[4803]: I0320 17:17:38.897439 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.147868 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd"] Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.148474 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.151902 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.152986 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.165977 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.173185 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"7bf0ad411aece19fe6b688f92a0d4012aa4ba9929b264e80a04aca998b52d8f9"} Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.174458 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 crc kubenswrapper[4803]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 17:17:39 crc kubenswrapper[4803]: apiVersion: v1 Mar 20 17:17:39 crc kubenswrapper[4803]: clusters: Mar 20 17:17:39 crc kubenswrapper[4803]: - cluster: Mar 20 17:17:39 crc 
kubenswrapper[4803]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 17:17:39 crc kubenswrapper[4803]: server: https://api-int.crc.testing:6443 Mar 20 17:17:39 crc kubenswrapper[4803]: name: default-cluster Mar 20 17:17:39 crc kubenswrapper[4803]: contexts: Mar 20 17:17:39 crc kubenswrapper[4803]: - context: Mar 20 17:17:39 crc kubenswrapper[4803]: cluster: default-cluster Mar 20 17:17:39 crc kubenswrapper[4803]: namespace: default Mar 20 17:17:39 crc kubenswrapper[4803]: user: default-auth Mar 20 17:17:39 crc kubenswrapper[4803]: name: default-context Mar 20 17:17:39 crc kubenswrapper[4803]: current-context: default-context Mar 20 17:17:39 crc kubenswrapper[4803]: kind: Config Mar 20 17:17:39 crc kubenswrapper[4803]: preferences: {} Mar 20 17:17:39 crc kubenswrapper[4803]: users: Mar 20 17:17:39 crc kubenswrapper[4803]: - name: default-auth Mar 20 17:17:39 crc kubenswrapper[4803]: user: Mar 20 17:17:39 crc kubenswrapper[4803]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:17:39 crc kubenswrapper[4803]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 17:17:39 crc kubenswrapper[4803]: EOF Mar 20 17:17:39 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s7vhj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-4v5dx_openshift-ovn-kubernetes(4326b171-36ab-465f-ba67-a636b36f1f89): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.175020 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zp898" event={"ID":"7a26ad31-dca7-4b95-80a8-d8a3db949d1a","Type":"ContainerStarted","Data":"55955554a3ac1af6ea1488677f7c0e8573c14643ed7ccf5f90fa08a5bf454d27"} Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.175668 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.175897 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 crc kubenswrapper[4803]: container 
&Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:39 crc kubenswrapper[4803]: set -uo pipefail Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 17:17:39 crc kubenswrapper[4803]: HOSTS_FILE="/etc/hosts" Mar 20 17:17:39 crc kubenswrapper[4803]: TEMP_FILE="/etc/hosts.tmp" Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: # Make a temporary file with the old hosts file's attributes. Mar 20 17:17:39 crc kubenswrapper[4803]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 17:17:39 crc kubenswrapper[4803]: echo "Failed to preserve hosts file. Exiting." Mar 20 17:17:39 crc kubenswrapper[4803]: exit 1 Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: while true; do Mar 20 17:17:39 crc kubenswrapper[4803]: declare -A svc_ips Mar 20 17:17:39 crc kubenswrapper[4803]: for svc in "${services[@]}"; do Mar 20 17:17:39 crc kubenswrapper[4803]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 17:17:39 crc kubenswrapper[4803]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 17:17:39 crc kubenswrapper[4803]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 17:17:39 crc kubenswrapper[4803]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 20 17:17:39 crc kubenswrapper[4803]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:39 crc kubenswrapper[4803]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:39 crc kubenswrapper[4803]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:39 crc kubenswrapper[4803]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 17:17:39 crc kubenswrapper[4803]: for i in ${!cmds[*]} Mar 20 17:17:39 crc kubenswrapper[4803]: do Mar 20 17:17:39 crc kubenswrapper[4803]: ips=($(eval "${cmds[i]}")) Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: svc_ips["${svc}"]="${ips[@]}" Mar 20 17:17:39 crc kubenswrapper[4803]: break Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: # Update /etc/hosts only if we get valid service IPs Mar 20 17:17:39 crc kubenswrapper[4803]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 17:17:39 crc kubenswrapper[4803]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 17:17:39 crc kubenswrapper[4803]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 17:17:39 crc kubenswrapper[4803]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 17:17:39 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:39 crc kubenswrapper[4803]: continue Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: # Append resolver entries for services Mar 20 17:17:39 crc kubenswrapper[4803]: rc=0 Mar 20 17:17:39 crc kubenswrapper[4803]: for svc in "${!svc_ips[@]}"; do Mar 20 17:17:39 crc kubenswrapper[4803]: for ip in ${svc_ips[${svc}]}; do Mar 20 17:17:39 crc kubenswrapper[4803]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ $rc -ne 0 ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:39 crc kubenswrapper[4803]: continue Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 17:17:39 crc kubenswrapper[4803]: # Replace /etc/hosts with our modified version if needed Mar 20 17:17:39 crc kubenswrapper[4803]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 17:17:39 crc kubenswrapper[4803]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:39 crc kubenswrapper[4803]: unset svc_ips Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6p2gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zp898_openshift-dns(7a26ad31-dca7-4b95-80a8-d8a3db949d1a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.176315 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d8jn6" event={"ID":"55c909c3-a57a-4440-9052-48718b1d2dfd","Type":"ContainerStarted","Data":"e0eff1993de21f9cb73a679b28656667f64ac31a2943baa9c91cd805f7fbab11"} Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 
17:17:39.177336 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zp898" podUID="7a26ad31-dca7-4b95-80a8-d8a3db949d1a" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.177704 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 crc kubenswrapper[4803]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 17:17:39 crc kubenswrapper[4803]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 17:17:39 crc kubenswrapper[4803]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22l6k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-d8jn6_openshift-multus(55c909c3-a57a-4440-9052-48718b1d2dfd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.178245 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0925e936c6f38845dea711d6fa503e05920b9c6a8bed258552109e68ac05402d"} Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.178611 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.178802 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-d8jn6" podUID="55c909c3-a57a-4440-9052-48718b1d2dfd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.179032 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"db333556be22b3709fa474d20989e7386ef016b5416588a0d7e752363f81fc55"} Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.179847 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.180054 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjqbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.180167 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"13ffdc2331540299e1752f348aebd62b70b19c3df70360c0c1a4b50b8f4012c0"} Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.181302 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.181483 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" 
event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerStarted","Data":"f7de67e450b0a5f8aecbffcf6f941b2fd898446a82f365617e56e105a6c8d179"} Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.181512 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 crc kubenswrapper[4803]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:39 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:39 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 17:17:39 crc kubenswrapper[4803]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:17:39 crc kubenswrapper[4803]: ho_enable="--enable-hybrid-overlay" Mar 20 17:17:39 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:17:39 crc kubenswrapper[4803]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:17:39 crc kubenswrapper[4803]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:17:39 crc kubenswrapper[4803]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:17:39 crc kubenswrapper[4803]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:17:39 crc kubenswrapper[4803]: --webhook-host=127.0.0.1 \ Mar 20 17:17:39 crc kubenswrapper[4803]: --webhook-port=9743 \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${ho_enable} \ Mar 20 17:17:39 crc kubenswrapper[4803]: --enable-interconnect \ 
Mar 20 17:17:39 crc kubenswrapper[4803]: --disable-approver \ Mar 20 17:17:39 crc kubenswrapper[4803]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:17:39 crc kubenswrapper[4803]: --wait-for-kubernetes-api=200s \ Mar 20 17:17:39 crc kubenswrapper[4803]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:17:39 crc kubenswrapper[4803]: --loglevel="${LOGLEVEL}" Mar 20 17:17:39 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMoun
t:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.182441 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjqbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.182606 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5swf" event={"ID":"ec2c9586-ac9f-467a-a353-e43ac2a99797","Type":"ContainerStarted","Data":"0bbfc26296ee880b3d42dd46dd3d506155387247fed748965906ae29d3f32101"} Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.183125 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbtw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-56rll_openshift-multus(dad4e80b-88f1-4e64-a7e6-136c1d3b6e67): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.183423 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 
crc kubenswrapper[4803]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:39 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:39 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:17:39 crc kubenswrapper[4803]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:17:39 crc kubenswrapper[4803]: --disable-webhook \ Mar 20 17:17:39 crc kubenswrapper[4803]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:17:39 crc kubenswrapper[4803]: --loglevel="${LOGLEVEL}" Mar 20 17:17:39 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.184314 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.185095 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-56rll" podUID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.185170 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.185245 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 crc kubenswrapper[4803]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 17:17:39 crc kubenswrapper[4803]: while [ true ]; Mar 20 17:17:39 crc kubenswrapper[4803]: do Mar 20 17:17:39 crc kubenswrapper[4803]: for f in $(ls /tmp/serviceca); do Mar 20 17:17:39 crc kubenswrapper[4803]: echo $f Mar 20 17:17:39 crc kubenswrapper[4803]: ca_file_path="/tmp/serviceca/${f}" Mar 20 17:17:39 crc kubenswrapper[4803]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 17:17:39 crc kubenswrapper[4803]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 17:17:39 crc kubenswrapper[4803]: if [ -e "${reg_dir_path}" ]; then Mar 20 17:17:39 crc kubenswrapper[4803]: cp -u 
$ca_file_path $reg_dir_path/ca.crt Mar 20 17:17:39 crc kubenswrapper[4803]: else Mar 20 17:17:39 crc kubenswrapper[4803]: mkdir $reg_dir_path Mar 20 17:17:39 crc kubenswrapper[4803]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: for d in $(ls /etc/docker/certs.d); do Mar 20 17:17:39 crc kubenswrapper[4803]: echo $d Mar 20 17:17:39 crc kubenswrapper[4803]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 17:17:39 crc kubenswrapper[4803]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 17:17:39 crc kubenswrapper[4803]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 17:17:39 crc kubenswrapper[4803]: rm -rf /etc/docker/certs.d/$d Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: sleep 60 & wait ${!} Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrlw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-l5swf_openshift-image-registry(ec2c9586-ac9f-467a-a353-e43ac2a99797): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.186842 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-l5swf" podUID="ec2c9586-ac9f-467a-a353-e43ac2a99797" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.189640 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.200676 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.212952 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.225369 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.233888 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with 
unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.246195 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nsx6\" (UniqueName: \"kubernetes.io/projected/332635bf-724e-479c-86bc-d08bb83cd6ef-kube-api-access-6nsx6\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.246837 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/332635bf-724e-479c-86bc-d08bb83cd6ef-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.246863 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.247748 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/332635bf-724e-479c-86bc-d08bb83cd6ef-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.248799 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/332635bf-724e-479c-86bc-d08bb83cd6ef-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.260599 4803 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.275925 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.298163 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.313728 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.329059 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.343840 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.350198 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/332635bf-724e-479c-86bc-d08bb83cd6ef-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.350260 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nsx6\" 
(UniqueName: \"kubernetes.io/projected/332635bf-724e-479c-86bc-d08bb83cd6ef-kube-api-access-6nsx6\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.350324 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/332635bf-724e-479c-86bc-d08bb83cd6ef-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.350374 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/332635bf-724e-479c-86bc-d08bb83cd6ef-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.351035 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/332635bf-724e-479c-86bc-d08bb83cd6ef-env-overrides\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.351416 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/332635bf-724e-479c-86bc-d08bb83cd6ef-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 
17:17:39.354729 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/332635bf-724e-479c-86bc-d08bb83cd6ef-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.357013 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.364056 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.369921 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nsx6\" (UniqueName: \"kubernetes.io/projected/332635bf-724e-479c-86bc-d08bb83cd6ef-kube-api-access-6nsx6\") pod \"ovnkube-control-plane-749d76644c-q5phd\" (UID: \"332635bf-724e-479c-86bc-d08bb83cd6ef\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.376970 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.386814 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.395260 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.406051 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.417075 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.436090 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.444646 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.458458 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.468170 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.470461 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.481830 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: W0320 17:17:39.485473 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod332635bf_724e_479c_86bc_d08bb83cd6ef.slice/crio-981fcb3c3eca2e6dba9199c52b1c0c5a9ccadf1348ade21d51a048bfa50b949f WatchSource:0}: Error finding container 981fcb3c3eca2e6dba9199c52b1c0c5a9ccadf1348ade21d51a048bfa50b949f: Status 404 returned error can't find the container with id 
981fcb3c3eca2e6dba9199c52b1c0c5a9ccadf1348ade21d51a048bfa50b949f Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.488197 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 crc kubenswrapper[4803]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:39 crc kubenswrapper[4803]: set -euo pipefail Mar 20 17:17:39 crc kubenswrapper[4803]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 17:17:39 crc kubenswrapper[4803]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 17:17:39 crc kubenswrapper[4803]: # As the secret mount is optional we must wait for the files to be present. Mar 20 17:17:39 crc kubenswrapper[4803]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 17:17:39 crc kubenswrapper[4803]: TS=$(date +%s) Mar 20 17:17:39 crc kubenswrapper[4803]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 17:17:39 crc kubenswrapper[4803]: HAS_LOGGED_INFO=0 Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: log_missing_certs(){ Mar 20 17:17:39 crc kubenswrapper[4803]: CUR_TS=$(date +%s) Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 17:17:39 crc kubenswrapper[4803]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 17:17:39 crc kubenswrapper[4803]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 17:17:39 crc kubenswrapper[4803]: HAS_LOGGED_INFO=1 Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: } Mar 20 17:17:39 crc kubenswrapper[4803]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 17:17:39 crc kubenswrapper[4803]: log_missing_certs Mar 20 17:17:39 crc kubenswrapper[4803]: sleep 5 Mar 20 17:17:39 crc kubenswrapper[4803]: done Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 17:17:39 crc kubenswrapper[4803]: exec /usr/bin/kube-rbac-proxy \ Mar 20 17:17:39 crc kubenswrapper[4803]: --logtostderr \ Mar 20 17:17:39 crc kubenswrapper[4803]: --secure-listen-address=:9108 \ Mar 20 17:17:39 crc kubenswrapper[4803]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 17:17:39 crc kubenswrapper[4803]: --upstream=http://127.0.0.1:29108/ \ Mar 20 17:17:39 crc kubenswrapper[4803]: --tls-private-key-file=${TLS_PK} \ Mar 20 17:17:39 crc kubenswrapper[4803]: --tls-cert-file=${TLS_CERT} Mar 20 17:17:39 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nsx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q5phd_openshift-ovn-kubernetes(332635bf-724e-479c-86bc-d08bb83cd6ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.490269 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:39 crc kubenswrapper[4803]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:39 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:39 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v4_join_subnet_opt= Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 
17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v6_join_subnet_opt= Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v4_transit_switch_subnet_opt= Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v6_transit_switch_subnet_opt= Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: dns_name_resolver_enabled_flag= Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "false" == "true" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: persistent_ips_enabled_flag= Mar 20 17:17:39 crc kubenswrapper[4803]: if [[ "true" == "true" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: # This is needed so that converting clusters from GA to TP Mar 20 17:17:39 crc kubenswrapper[4803]: # will rollout control plane pods as well Mar 20 17:17:39 crc kubenswrapper[4803]: network_segmentation_enabled_flag= Mar 20 17:17:39 crc kubenswrapper[4803]: multi_network_enabled_flag= Mar 20 17:17:39 crc 
kubenswrapper[4803]: if [[ "true" == "true" ]]; then Mar 20 17:17:39 crc kubenswrapper[4803]: multi_network_enabled_flag="--enable-multi-network" Mar 20 17:17:39 crc kubenswrapper[4803]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 17:17:39 crc kubenswrapper[4803]: fi Mar 20 17:17:39 crc kubenswrapper[4803]: Mar 20 17:17:39 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 17:17:39 crc kubenswrapper[4803]: exec /usr/bin/ovnkube \ Mar 20 17:17:39 crc kubenswrapper[4803]: --enable-interconnect \ Mar 20 17:17:39 crc kubenswrapper[4803]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 17:17:39 crc kubenswrapper[4803]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 17:17:39 crc kubenswrapper[4803]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 17:17:39 crc kubenswrapper[4803]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 17:17:39 crc kubenswrapper[4803]: --metrics-enable-pprof \ Mar 20 17:17:39 crc kubenswrapper[4803]: --metrics-enable-config-duration \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${ovn_v4_join_subnet_opt} \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${ovn_v6_join_subnet_opt} \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${dns_name_resolver_enabled_flag} \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${persistent_ips_enabled_flag} \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${multi_network_enabled_flag} \ Mar 20 17:17:39 crc kubenswrapper[4803]: ${network_segmentation_enabled_flag} Mar 20 17:17:39 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nsx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q5phd_openshift-ovn-kubernetes(332635bf-724e-479c-86bc-d08bb83cd6ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:39 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.491396 4803 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" podUID="332635bf-724e-479c-86bc-d08bb83cd6ef" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.514929 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.552177 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.552297 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552331 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:17:41.552304226 +0000 UTC m=+71.463896296 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552385 4803 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.552424 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552433 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:41.55242117 +0000 UTC m=+71.464013240 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.552491 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552664 4803 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.552722 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552760 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:41.552741951 +0000 UTC m=+71.464334061 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552680 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552815 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552837 4803 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.552889 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:41.552876285 +0000 UTC m=+71.464468385 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.653177 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.653313 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.653333 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.653347 4803 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.653392 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:17:41.653378912 +0000 UTC m=+71.564970982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.847368 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.847377 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.847515 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.847395 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.847704 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.847639 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.880731 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-llxn2"] Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.881097 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:39 crc kubenswrapper[4803]: E0320 17:17:39.881140 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.890857 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.940242 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.952087 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.956410 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod 
\"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.956495 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nsd\" (UniqueName: \"kubernetes.io/projected/56b68c7b-2d4d-4628-9f1f-85ec48141f82-kube-api-access-v6nsd\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.963557 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.973755 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:39 crc kubenswrapper[4803]: I0320 17:17:39.993677 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.006024 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.023922 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.041151 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.051492 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.057520 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nsd\" (UniqueName: \"kubernetes.io/projected/56b68c7b-2d4d-4628-9f1f-85ec48141f82-kube-api-access-v6nsd\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:40 crc 
kubenswrapper[4803]: I0320 17:17:40.057799 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.057873 4803 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.058272 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs podName:56b68c7b-2d4d-4628-9f1f-85ec48141f82 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:40.558240454 +0000 UTC m=+70.469832564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs") pod "network-metrics-daemon-llxn2" (UID: "56b68c7b-2d4d-4628-9f1f-85ec48141f82") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.062406 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.071918 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.087687 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-v6nsd\" (UniqueName: \"kubernetes.io/projected/56b68c7b-2d4d-4628-9f1f-85ec48141f82-kube-api-access-v6nsd\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.095703 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.135634 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.174748 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.185616 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" event={"ID":"332635bf-724e-479c-86bc-d08bb83cd6ef","Type":"ContainerStarted","Data":"981fcb3c3eca2e6dba9199c52b1c0c5a9ccadf1348ade21d51a048bfa50b949f"} Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.187253 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:40 crc kubenswrapper[4803]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:40 crc kubenswrapper[4803]: set -euo pipefail Mar 20 17:17:40 crc kubenswrapper[4803]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 17:17:40 crc kubenswrapper[4803]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 17:17:40 crc kubenswrapper[4803]: # As the secret mount is optional we must wait for the files to be present. Mar 20 17:17:40 crc kubenswrapper[4803]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 20 17:17:40 crc kubenswrapper[4803]: TS=$(date +%s) Mar 20 17:17:40 crc kubenswrapper[4803]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 17:17:40 crc kubenswrapper[4803]: HAS_LOGGED_INFO=0 Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: log_missing_certs(){ Mar 20 17:17:40 crc kubenswrapper[4803]: CUR_TS=$(date +%s) Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 20 17:17:40 crc kubenswrapper[4803]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 17:17:40 crc kubenswrapper[4803]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 17:17:40 crc kubenswrapper[4803]: HAS_LOGGED_INFO=1 Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: } Mar 20 17:17:40 crc kubenswrapper[4803]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 20 17:17:40 crc kubenswrapper[4803]: log_missing_certs Mar 20 17:17:40 crc kubenswrapper[4803]: sleep 5 Mar 20 17:17:40 crc kubenswrapper[4803]: done Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 17:17:40 crc kubenswrapper[4803]: exec /usr/bin/kube-rbac-proxy \ Mar 20 17:17:40 crc kubenswrapper[4803]: --logtostderr \ Mar 20 17:17:40 crc kubenswrapper[4803]: --secure-listen-address=:9108 \ Mar 20 17:17:40 crc kubenswrapper[4803]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 17:17:40 crc kubenswrapper[4803]: --upstream=http://127.0.0.1:29108/ \ Mar 20 17:17:40 crc kubenswrapper[4803]: --tls-private-key-file=${TLS_PK} \ Mar 20 17:17:40 crc kubenswrapper[4803]: --tls-cert-file=${TLS_CERT} Mar 20 17:17:40 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nsx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q5phd_openshift-ovn-kubernetes(332635bf-724e-479c-86bc-d08bb83cd6ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:40 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.189399 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:40 crc kubenswrapper[4803]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:40 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:40 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v4_join_subnet_opt= Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 
17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v6_join_subnet_opt= Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v4_transit_switch_subnet_opt= Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v6_transit_switch_subnet_opt= Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: dns_name_resolver_enabled_flag= Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ "false" == "true" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: persistent_ips_enabled_flag= Mar 20 17:17:40 crc kubenswrapper[4803]: if [[ "true" == "true" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: # This is needed so that converting clusters from GA to TP Mar 20 17:17:40 crc kubenswrapper[4803]: # will rollout control plane pods as well Mar 20 17:17:40 crc kubenswrapper[4803]: network_segmentation_enabled_flag= Mar 20 17:17:40 crc kubenswrapper[4803]: multi_network_enabled_flag= Mar 20 17:17:40 crc 
kubenswrapper[4803]: if [[ "true" == "true" ]]; then Mar 20 17:17:40 crc kubenswrapper[4803]: multi_network_enabled_flag="--enable-multi-network" Mar 20 17:17:40 crc kubenswrapper[4803]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 17:17:40 crc kubenswrapper[4803]: fi Mar 20 17:17:40 crc kubenswrapper[4803]: Mar 20 17:17:40 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 17:17:40 crc kubenswrapper[4803]: exec /usr/bin/ovnkube \ Mar 20 17:17:40 crc kubenswrapper[4803]: --enable-interconnect \ Mar 20 17:17:40 crc kubenswrapper[4803]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 17:17:40 crc kubenswrapper[4803]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 17:17:40 crc kubenswrapper[4803]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 17:17:40 crc kubenswrapper[4803]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 17:17:40 crc kubenswrapper[4803]: --metrics-enable-pprof \ Mar 20 17:17:40 crc kubenswrapper[4803]: --metrics-enable-config-duration \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${ovn_v4_join_subnet_opt} \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${ovn_v6_join_subnet_opt} \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${dns_name_resolver_enabled_flag} \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${persistent_ips_enabled_flag} \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${multi_network_enabled_flag} \ Mar 20 17:17:40 crc kubenswrapper[4803]: ${network_segmentation_enabled_flag} Mar 20 17:17:40 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nsx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q5phd_openshift-ovn-kubernetes(332635bf-724e-479c-86bc-d08bb83cd6ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:40 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.190604 4803 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" podUID="332635bf-724e-479c-86bc-d08bb83cd6ef" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.218233 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.256340 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.301827 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.334398 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.375952 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.419729 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.434332 4803 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.436147 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.436247 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.436277 4803 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.436865 4803 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.453324 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.508706 4803 kubelet_node_status.go:115] "Node was previously 
registered" node="crc" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.508820 4803 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.510042 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.510107 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.510133 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.510167 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.510191 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.530444 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.536359 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.536880 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.536995 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.537133 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.537239 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.537331 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.552310 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.557103 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.557165 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.557185 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.557213 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.557230 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.562119 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.562327 4803 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.562461 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs podName:56b68c7b-2d4d-4628-9f1f-85ec48141f82 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:41.562426113 +0000 UTC m=+71.474018223 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs") pod "network-metrics-daemon-llxn2" (UID: "56b68c7b-2d4d-4628-9f1f-85ec48141f82") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.571054 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4
224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.575624 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.575768 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.575797 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.575815 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.575827 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.579115 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.591694 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.595178 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.595251 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.595272 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.595294 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.595312 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.608964 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: E0320 17:17:40.609108 4803 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.610639 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.610685 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.610702 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.610721 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.610737 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.617861 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.659256 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.694371 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.713849 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.713894 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.713904 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.713920 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.713933 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.734564 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.774355 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.816689 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.816767 4803 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.816792 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.816823 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.816848 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.816998 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.863278 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.895674 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.919673 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.919740 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.919755 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.919780 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.919801 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:40Z","lastTransitionTime":"2026-03-20T17:17:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.938062 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:40 crc kubenswrapper[4803]: I0320 17:17:40.981008 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.013129 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.022641 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.022689 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.022701 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.022721 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.022735 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.054855 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, 
/tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.099124 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.125155 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.125208 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.125222 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.125240 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.125253 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.139083 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.171140 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.221384 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.229696 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.229760 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.229774 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.229796 4803 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.229811 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.255485 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.294190 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.333260 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.333305 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.333315 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.333335 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.333348 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.339033 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.377044 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.434027 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.435622 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.435650 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.435660 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.435676 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.435688 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.538075 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.538124 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.538132 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.538147 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.538158 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.571949 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.572185 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572224 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:17:45.572185826 +0000 UTC m=+75.483777936 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572321 4803 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.572366 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572424 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:45.572395494 +0000 UTC m=+75.483987654 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.572472 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572575 4803 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.572587 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572644 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:45.572629381 +0000 UTC m=+75.484221481 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572685 4803 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572757 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs podName:56b68c7b-2d4d-4628-9f1f-85ec48141f82 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:43.572737685 +0000 UTC m=+73.484329765 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs") pod "network-metrics-daemon-llxn2" (UID: "56b68c7b-2d4d-4628-9f1f-85ec48141f82") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572887 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572935 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.572955 4803 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.573040 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:45.573012974 +0000 UTC m=+75.484605044 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.641101 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.641142 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.641159 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.641183 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.641201 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.674124 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.674309 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.674335 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.674354 4803 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.674441 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:45.674420381 +0000 UTC m=+75.586012491 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.744082 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.744114 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.744131 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.744175 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.744193 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847076 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847123 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.847206 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.847328 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847441 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847488 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847458 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847618 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.847643 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847507 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847774 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.847788 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:41 crc kubenswrapper[4803]: E0320 17:17:41.847788 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.950097 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.950139 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.950150 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.950166 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:41 crc kubenswrapper[4803]: I0320 17:17:41.950178 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:41Z","lastTransitionTime":"2026-03-20T17:17:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.053948 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.054025 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.054049 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.054085 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.054104 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.158008 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.158105 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.158129 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.158155 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.158173 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.261546 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.261594 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.261626 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.261648 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.261659 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.365315 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.365381 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.365401 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.365427 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.365451 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.468725 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.469054 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.469205 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.469419 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.469612 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.573003 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.573086 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.573105 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.573138 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.573164 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.675802 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.675833 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.675840 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.675854 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.675863 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.778606 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.778676 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.778694 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.778721 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.778739 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.881787 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.881843 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.881855 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.881872 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.881886 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.984599 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.984643 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.984653 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.984668 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:42 crc kubenswrapper[4803]: I0320 17:17:42.984679 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:42Z","lastTransitionTime":"2026-03-20T17:17:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.087898 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.087945 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.087962 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.087984 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.088001 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.191371 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.191426 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.191446 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.191474 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.191496 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.294361 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.296036 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.296240 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.296385 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.296507 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.399826 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.400127 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.400487 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.400693 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.400828 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.504281 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.504344 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.504364 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.504392 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.504409 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.595448 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:43 crc kubenswrapper[4803]: E0320 17:17:43.595738 4803 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:43 crc kubenswrapper[4803]: E0320 17:17:43.595907 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs podName:56b68c7b-2d4d-4628-9f1f-85ec48141f82 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:47.595871917 +0000 UTC m=+77.507463987 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs") pod "network-metrics-daemon-llxn2" (UID: "56b68c7b-2d4d-4628-9f1f-85ec48141f82") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.607058 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.607101 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.607114 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.607135 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.607149 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.709469 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.709562 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.709578 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.709623 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.709636 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.812327 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.812385 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.812402 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.812428 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.812446 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.847234 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.847287 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:43 crc kubenswrapper[4803]: E0320 17:17:43.847351 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.847624 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:43 crc kubenswrapper[4803]: E0320 17:17:43.847611 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:43 crc kubenswrapper[4803]: E0320 17:17:43.847937 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.848063 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:43 crc kubenswrapper[4803]: E0320 17:17:43.848266 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.915538 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.915597 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.915608 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.915631 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:43 crc kubenswrapper[4803]: I0320 17:17:43.915644 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:43Z","lastTransitionTime":"2026-03-20T17:17:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.018975 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.019026 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.019042 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.019062 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.019076 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.122081 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.122118 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.122130 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.122178 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.122197 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.224590 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.224637 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.224653 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.224683 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.224705 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.327259 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.327333 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.327355 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.327388 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.327409 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.430029 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.430087 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.430105 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.430129 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.430149 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.532558 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.532622 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.532638 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.533012 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.533105 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.636383 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.636440 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.636480 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.636510 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.636560 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.739918 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.739985 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.740004 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.740028 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.740045 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.843116 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.843179 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.843197 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.843223 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.843242 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.946198 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.946259 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.946276 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.946303 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:44 crc kubenswrapper[4803]: I0320 17:17:44.946326 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:44Z","lastTransitionTime":"2026-03-20T17:17:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.048968 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.049099 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.049120 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.049145 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.049162 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.152519 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.152611 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.152630 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.152654 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.152673 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.256932 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.256985 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.257003 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.257026 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.257043 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.359682 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.359752 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.359780 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.359811 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.359831 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.462685 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.463228 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.463407 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.463730 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.463874 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.567019 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.567547 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.567690 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.567803 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.568023 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.623229 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.623379 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.623441 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.623502 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.623720 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 
17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.623747 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.623768 4803 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.623832 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:53.623812469 +0000 UTC m=+83.535404579 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.624311 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:17:53.624285795 +0000 UTC m=+83.535877895 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.624421 4803 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.624555 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:53.624509293 +0000 UTC m=+83.536101453 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.625145 4803 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.625367 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:17:53.625331311 +0000 UTC m=+83.536923581 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.670935 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.671017 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.671040 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.671067 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.671086 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.724596 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.724899 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.724927 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.724947 4803 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.725014 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:53.724992219 +0000 UTC m=+83.636584319 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.773516 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.774089 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.774285 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.774434 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.774591 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.847599 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.847763 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.848020 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.847787 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.848325 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.848371 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.849872 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:45 crc kubenswrapper[4803]: E0320 17:17:45.849984 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.877875 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.877939 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.877953 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.877990 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.878004 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.980932 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.981018 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.981042 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.981070 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:45 crc kubenswrapper[4803]: I0320 17:17:45.981089 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:45Z","lastTransitionTime":"2026-03-20T17:17:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.083926 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.084009 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.084037 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.084061 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.084078 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.186848 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.186917 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.186943 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.186966 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.186982 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.289877 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.289943 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.289963 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.289988 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.290012 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.393279 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.393341 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.393360 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.393384 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.393402 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.497096 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.497252 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.497281 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.497308 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.497326 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.599988 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.600064 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.600088 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.600113 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.600131 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.703239 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.703297 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.703313 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.703336 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.703353 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.806738 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.806816 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.806836 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.806861 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.806881 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.909482 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.909592 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.909618 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.909645 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:46 crc kubenswrapper[4803]: I0320 17:17:46.909665 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:46Z","lastTransitionTime":"2026-03-20T17:17:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.012651 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.012725 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.012752 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.012784 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.012806 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.116127 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.116182 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.116203 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.116230 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.116253 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.218437 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.218506 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.218563 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.218594 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.218615 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.321584 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.321646 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.321664 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.321691 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.321708 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.424623 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.424681 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.424704 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.424732 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.424754 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.527330 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.527398 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.527415 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.527440 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.527463 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.630606 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.630701 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.630731 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.630773 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.630802 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.647942 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:47 crc kubenswrapper[4803]: E0320 17:17:47.648123 4803 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:47 crc kubenswrapper[4803]: E0320 17:17:47.648198 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs podName:56b68c7b-2d4d-4628-9f1f-85ec48141f82 nodeName:}" failed. No retries permitted until 2026-03-20 17:17:55.648176592 +0000 UTC m=+85.559768692 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs") pod "network-metrics-daemon-llxn2" (UID: "56b68c7b-2d4d-4628-9f1f-85ec48141f82") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.735343 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.735432 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.735471 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.735504 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.735550 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.839143 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.839200 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.839219 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.839245 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.839264 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.847992 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.848031 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.848039 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.848111 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:47 crc kubenswrapper[4803]: E0320 17:17:47.848309 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:47 crc kubenswrapper[4803]: E0320 17:17:47.848454 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:47 crc kubenswrapper[4803]: E0320 17:17:47.848643 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:47 crc kubenswrapper[4803]: E0320 17:17:47.848823 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.942852 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.942918 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.942954 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.942984 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:47 crc kubenswrapper[4803]: I0320 17:17:47.943005 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:47Z","lastTransitionTime":"2026-03-20T17:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.046184 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.046243 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.046263 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.046289 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.046308 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.149032 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.149134 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.149151 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.149169 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.149437 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.253921 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.253968 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.253980 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.253997 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.254008 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.356943 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.357017 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.357041 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.357186 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.357211 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.461053 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.461121 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.461143 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.461166 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.461184 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.565121 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.565220 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.565241 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.565275 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.565299 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.668778 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.668858 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.668881 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.668912 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.668936 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.772830 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.772918 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.772946 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.772980 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.773004 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.876513 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.876634 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.876659 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.876692 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.876712 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.981694 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.981772 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.981797 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.981829 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:48 crc kubenswrapper[4803]: I0320 17:17:48.981853 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:48Z","lastTransitionTime":"2026-03-20T17:17:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.085194 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.085586 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.085741 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.085967 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.086349 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.189866 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.189914 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.189931 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.190041 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.190056 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.292549 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.292593 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.292601 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.292622 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.292633 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.395962 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.396051 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.396072 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.396106 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.396133 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.500047 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.500109 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.500127 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.500151 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.500169 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.605493 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.605593 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.605617 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.605683 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.605705 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.710102 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.710175 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.710194 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.710221 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.710240 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.813834 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.814240 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.814436 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.814738 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.814962 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.848085 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.848301 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.848436 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.848658 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.849017 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.849162 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.849291 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.849412 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.852231 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:49 crc kubenswrapper[4803]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:49 crc kubenswrapper[4803]: set -uo pipefail Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 17:17:49 crc kubenswrapper[4803]: HOSTS_FILE="/etc/hosts" Mar 20 17:17:49 crc kubenswrapper[4803]: TEMP_FILE="/etc/hosts.tmp" Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: # Make a temporary file with the old hosts file's attributes. Mar 20 17:17:49 crc kubenswrapper[4803]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 17:17:49 crc kubenswrapper[4803]: echo "Failed to preserve hosts file. Exiting." Mar 20 17:17:49 crc kubenswrapper[4803]: exit 1 Mar 20 17:17:49 crc kubenswrapper[4803]: fi Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: while true; do Mar 20 17:17:49 crc kubenswrapper[4803]: declare -A svc_ips Mar 20 17:17:49 crc kubenswrapper[4803]: for svc in "${services[@]}"; do Mar 20 17:17:49 crc kubenswrapper[4803]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 17:17:49 crc kubenswrapper[4803]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 20 17:17:49 crc kubenswrapper[4803]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 17:17:49 crc kubenswrapper[4803]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 17:17:49 crc kubenswrapper[4803]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:49 crc kubenswrapper[4803]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:49 crc kubenswrapper[4803]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 17:17:49 crc kubenswrapper[4803]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 17:17:49 crc kubenswrapper[4803]: for i in ${!cmds[*]} Mar 20 17:17:49 crc kubenswrapper[4803]: do Mar 20 17:17:49 crc kubenswrapper[4803]: ips=($(eval "${cmds[i]}")) Mar 20 17:17:49 crc kubenswrapper[4803]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 17:17:49 crc kubenswrapper[4803]: svc_ips["${svc}"]="${ips[@]}" Mar 20 17:17:49 crc kubenswrapper[4803]: break Mar 20 17:17:49 crc kubenswrapper[4803]: fi Mar 20 17:17:49 crc kubenswrapper[4803]: done Mar 20 17:17:49 crc kubenswrapper[4803]: done Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: # Update /etc/hosts only if we get valid service IPs Mar 20 17:17:49 crc kubenswrapper[4803]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 17:17:49 crc kubenswrapper[4803]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 17:17:49 crc kubenswrapper[4803]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 17:17:49 crc kubenswrapper[4803]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 17:17:49 crc kubenswrapper[4803]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 17:17:49 crc kubenswrapper[4803]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 17:17:49 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:49 crc kubenswrapper[4803]: continue Mar 20 17:17:49 crc kubenswrapper[4803]: fi Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: # Append resolver entries for services Mar 20 17:17:49 crc kubenswrapper[4803]: rc=0 Mar 20 17:17:49 crc kubenswrapper[4803]: for svc in "${!svc_ips[@]}"; do Mar 20 17:17:49 crc kubenswrapper[4803]: for ip in ${svc_ips[${svc}]}; do Mar 20 17:17:49 crc kubenswrapper[4803]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 20 17:17:49 crc kubenswrapper[4803]: done Mar 20 17:17:49 crc kubenswrapper[4803]: done Mar 20 17:17:49 crc kubenswrapper[4803]: if [[ $rc -ne 0 ]]; then Mar 20 17:17:49 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:49 crc kubenswrapper[4803]: continue Mar 20 17:17:49 crc kubenswrapper[4803]: fi Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: Mar 20 17:17:49 crc kubenswrapper[4803]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 17:17:49 crc kubenswrapper[4803]: # Replace /etc/hosts with our modified version if needed Mar 20 17:17:49 crc kubenswrapper[4803]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 17:17:49 crc kubenswrapper[4803]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 17:17:49 crc kubenswrapper[4803]: fi Mar 20 17:17:49 crc kubenswrapper[4803]: sleep 60 & wait Mar 20 17:17:49 crc kubenswrapper[4803]: unset svc_ips Mar 20 17:17:49 crc kubenswrapper[4803]: done Mar 20 17:17:49 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6p2gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zp898_openshift-dns(7a26ad31-dca7-4b95-80a8-d8a3db949d1a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:49 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.852929 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:49 crc kubenswrapper[4803]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c 
#!/bin/bash Mar 20 17:17:49 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:49 crc kubenswrapper[4803]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 17:17:49 crc kubenswrapper[4803]: source /etc/kubernetes/apiserver-url.env Mar 20 17:17:49 crc kubenswrapper[4803]: else Mar 20 17:17:49 crc kubenswrapper[4803]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 17:17:49 crc kubenswrapper[4803]: exit 1 Mar 20 17:17:49 crc kubenswrapper[4803]: fi Mar 20 17:17:49 crc kubenswrapper[4803]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 17:17:49 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI
_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:49 crc kubenswrapper[4803]: > 
logger="UnhandledError" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.853747 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zp898" podUID="7a26ad31-dca7-4b95-80a8-d8a3db949d1a" Mar 20 17:17:49 crc kubenswrapper[4803]: E0320 17:17:49.854252 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.918425 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.918512 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.918553 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.918582 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:49 crc kubenswrapper[4803]: I0320 17:17:49.918604 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:49Z","lastTransitionTime":"2026-03-20T17:17:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
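The network-operator entrypoint quoted in the error above turns on `set -o allexport` before sourcing /etc/kubernetes/apiserver-url.env, so every variable the file assigns is exported to the `exec`'d operator process without per-variable `export` statements. A minimal, self-contained sketch of that pattern; the temp file and the `KUBERNETES_SERVICE_HOST=api.crc.testing` contents are stand-ins for the real env file:

```shell
#!/usr/bin/env bash
# Sketch of the allexport + source pattern from the network-operator entrypoint.
# env_file stands in for /etc/kubernetes/apiserver-url.env.
set -euo pipefail

env_file="$(mktemp)"
echo 'KUBERNETES_SERVICE_HOST=api.crc.testing' > "${env_file}"

set -o allexport            # assignments made while sourcing are auto-exported
if [[ -f "${env_file}" ]]; then
  source "${env_file}"
else
  echo "Error: ${env_file} is missing" >&2
  exit 1
fi
set +o allexport

# A child process now sees the variable without any explicit `export`.
bash -c 'echo "KUBERNETES_SERVICE_HOST=${KUBERNETES_SERVICE_HOST}"'
```

The explicit missing-file branch matters here: without the env file the operator cannot reach the apiserver, so failing fast with exit 1 is preferable to starting half-configured.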
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.022738 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.022799 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.022819 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.022853 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.022874 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.125943 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.126028 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.126042 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.126067 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.126081 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.229013 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.229088 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.229110 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.229134 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.229156 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.332299 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.332360 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.332371 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.332402 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.332417 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.436096 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.436145 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.436154 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.436172 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.436181 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.539652 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.540171 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.540321 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.540471 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.540651 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.627189 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.627419 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.627591 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.627737 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.627894 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
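The NodeNotReady condition repeating above is the kubelet's network-readiness gate: the runtime reports NetworkReady=false until a CNI network configuration file appears in /etc/kubernetes/cni/net.d/, which only happens once the network operator (blocked here on the envvar error) deploys the plugin. A rough, self-contained approximation of that file check; the real logic lives in the kubelet/CRI-O Go code, and the temp directory plus the ovn-kubernetes file name are illustrative:

```shell
#!/usr/bin/env bash
# Approximation of the "no CNI configuration file" readiness check.
# cni_dir stands in for /etc/kubernetes/cni/net.d/.
set -euo pipefail
shopt -s nullglob           # unmatched globs expand to nothing, not themselves

cni_dir="$(mktemp -d)"

network_ready() {
  # CNI config loaders accept .conf, .conflist, and .json files.
  local confs=("${cni_dir}"/*.conf "${cni_dir}"/*.conflist "${cni_dir}"/*.json)
  [ "${#confs[@]}" -gt 0 ]
}

network_ready && echo "NetworkReady=true" \
  || echo "NetworkReady=false: no CNI configuration file in ${cni_dir}"

# Once the network operator writes its config, the same check passes.
cat > "${cni_dir}/10-ovn-kubernetes.conflist" <<'EOF'
{"cniVersion": "0.4.0", "name": "ovn-kubernetes",
 "plugins": [{"type": "ovn-k8s-cni-overlay"}]}
EOF
network_ready && echo "NetworkReady=true"
```

This is why the condition clears on its own later in a healthy startup: nothing in the kubelet needs fixing, it is simply polling for the plugin's config file.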
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.646172 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.653038 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.653109 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.653128 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.653155 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.653173 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.676661 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.682796 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.682877 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.682904 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.682941 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.682962 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.702594 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.708586 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.708655 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.708675 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.708708 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.708728 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.727654 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.733903 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.733978 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.734006 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.734042 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.734067 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.752397 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.752655 4803 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.755108 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.755158 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.755177 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.755203 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.755227 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.850232 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:50 crc kubenswrapper[4803]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:50 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:50 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:50 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:50 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:50 crc kubenswrapper[4803]: fi Mar 20 17:17:50 crc kubenswrapper[4803]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 17:17:50 crc kubenswrapper[4803]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 17:17:50 crc kubenswrapper[4803]: ho_enable="--enable-hybrid-overlay" Mar 20 17:17:50 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 17:17:50 crc kubenswrapper[4803]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 17:17:50 crc kubenswrapper[4803]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 17:17:50 crc kubenswrapper[4803]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:17:50 crc kubenswrapper[4803]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 17:17:50 crc kubenswrapper[4803]: --webhook-host=127.0.0.1 \ Mar 20 17:17:50 crc kubenswrapper[4803]: --webhook-port=9743 \ Mar 20 17:17:50 crc kubenswrapper[4803]: ${ho_enable} \ Mar 20 17:17:50 crc kubenswrapper[4803]: --enable-interconnect \ Mar 20 17:17:50 crc kubenswrapper[4803]: --disable-approver \ Mar 20 17:17:50 crc kubenswrapper[4803]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 17:17:50 crc kubenswrapper[4803]: --wait-for-kubernetes-api=200s \ Mar 20 17:17:50 crc kubenswrapper[4803]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 17:17:50 crc kubenswrapper[4803]: --loglevel="${LOGLEVEL}" Mar 20 17:17:50 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:50 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.853953 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:50 crc kubenswrapper[4803]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:50 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:50 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:50 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:50 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:50 crc kubenswrapper[4803]: fi Mar 20 17:17:50 crc kubenswrapper[4803]: Mar 20 17:17:50 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 17:17:50 crc kubenswrapper[4803]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 17:17:50 crc kubenswrapper[4803]: --disable-webhook \ Mar 20 17:17:50 crc kubenswrapper[4803]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 17:17:50 crc kubenswrapper[4803]: --loglevel="${LOGLEVEL}" Mar 20 17:17:50 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:50 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:50 crc kubenswrapper[4803]: E0320 17:17:50.856599 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.858846 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.858902 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.858921 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.858944 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.858965 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.867625 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.887875 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.916181 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.930241 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.950750 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.962103 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.962179 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.962201 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.962239 4803 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.962268 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:50Z","lastTransitionTime":"2026-03-20T17:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.964077 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.980809 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:50 crc kubenswrapper[4803]: I0320 17:17:50.996509 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.014701 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.022173 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.032006 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.050675 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.064141 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.065310 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.065578 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.065722 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.065856 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.066029 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.079901 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.095350 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.110019 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.129497 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-
20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.149103 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.169328 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.169390 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.169407 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.169437 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.169456 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.171453 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.183765 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.200181 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.224970 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.236116 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.251198 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.270726 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.273225 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.273286 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.273311 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.273343 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.273630 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.289115 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.303317 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.317975 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.335861 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.350027 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.363967 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.376918 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.376991 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 
17:17:51.377010 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.377040 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.377060 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.474264 4803 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.480461 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.480507 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.480557 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.480583 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.480599 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.583591 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.583639 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.583651 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.583672 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.583686 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.686463 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.686498 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.686507 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.686540 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.686552 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.789416 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.789486 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.789510 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.789613 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.789637 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.847670 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.847754 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.847697 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.847855 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.847882 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.848299 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.849183 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.849323 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.850936 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjqbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.851303 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:51 crc kubenswrapper[4803]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 17:17:51 crc kubenswrapper[4803]: while [ true ]; Mar 20 17:17:51 crc kubenswrapper[4803]: do Mar 20 17:17:51 crc kubenswrapper[4803]: for f in $(ls /tmp/serviceca); do Mar 20 17:17:51 crc kubenswrapper[4803]: echo $f Mar 20 17:17:51 crc kubenswrapper[4803]: ca_file_path="/tmp/serviceca/${f}" Mar 20 17:17:51 crc kubenswrapper[4803]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 17:17:51 crc kubenswrapper[4803]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 17:17:51 crc kubenswrapper[4803]: if [ -e "${reg_dir_path}" ]; 
then Mar 20 17:17:51 crc kubenswrapper[4803]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 17:17:51 crc kubenswrapper[4803]: else Mar 20 17:17:51 crc kubenswrapper[4803]: mkdir $reg_dir_path Mar 20 17:17:51 crc kubenswrapper[4803]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 17:17:51 crc kubenswrapper[4803]: fi Mar 20 17:17:51 crc kubenswrapper[4803]: done Mar 20 17:17:51 crc kubenswrapper[4803]: for d in $(ls /etc/docker/certs.d); do Mar 20 17:17:51 crc kubenswrapper[4803]: echo $d Mar 20 17:17:51 crc kubenswrapper[4803]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 17:17:51 crc kubenswrapper[4803]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 17:17:51 crc kubenswrapper[4803]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 17:17:51 crc kubenswrapper[4803]: rm -rf /etc/docker/certs.d/$d Mar 20 17:17:51 crc kubenswrapper[4803]: fi Mar 20 17:17:51 crc kubenswrapper[4803]: done Mar 20 17:17:51 crc kubenswrapper[4803]: sleep 60 & wait ${!} Mar 20 17:17:51 crc kubenswrapper[4803]: done Mar 20 17:17:51 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qrlw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-l5swf_openshift-image-registry(ec2c9586-ac9f-467a-a353-e43ac2a99797): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:51 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.852403 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-l5swf" podUID="ec2c9586-ac9f-467a-a353-e43ac2a99797" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.854677 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjqbr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 17:17:51 crc kubenswrapper[4803]: E0320 17:17:51.855841 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.893239 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.893289 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.893304 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.893328 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.893345 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.996904 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.997224 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.997370 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.997598 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:51 crc kubenswrapper[4803]: I0320 17:17:51.997747 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:51Z","lastTransitionTime":"2026-03-20T17:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.101218 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.101263 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.101274 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.101295 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.101307 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.218168 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.218248 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.218272 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.218302 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.218325 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.321353 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.321410 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.321429 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.321453 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.321473 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.424492 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.424599 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.424617 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.424643 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.424662 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.527796 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.527870 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.527894 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.527923 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.527944 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.631406 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.631479 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.631508 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.631583 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.631611 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.735124 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.735199 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.735222 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.735251 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.735279 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.839068 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.839119 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.839135 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.839157 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.839175 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:52 crc kubenswrapper[4803]: E0320 17:17:52.850684 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbtw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-56rll_openshift-multus(dad4e80b-88f1-4e64-a7e6-136c1d3b6e67): CreateContainerConfigError: services have not yet been read at least once, cannot 
construct envvars" logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4803]: E0320 17:17:52.852456 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-56rll" podUID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" Mar 20 17:17:52 crc kubenswrapper[4803]: E0320 17:17:52.857206 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:52 crc kubenswrapper[4803]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 20 17:17:52 crc kubenswrapper[4803]: set -euo pipefail Mar 20 17:17:52 crc kubenswrapper[4803]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 20 17:17:52 crc kubenswrapper[4803]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 20 17:17:52 crc kubenswrapper[4803]: # As the secret mount is optional we must wait for the files to be present. Mar 20 17:17:52 crc kubenswrapper[4803]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 20 17:17:52 crc kubenswrapper[4803]: TS=$(date +%s) Mar 20 17:17:52 crc kubenswrapper[4803]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 20 17:17:52 crc kubenswrapper[4803]: HAS_LOGGED_INFO=0 Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: log_missing_certs(){ Mar 20 17:17:52 crc kubenswrapper[4803]: CUR_TS=$(date +%s) Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. 
Mar 20 17:17:52 crc kubenswrapper[4803]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 20 17:17:52 crc kubenswrapper[4803]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 20 17:17:52 crc kubenswrapper[4803]: HAS_LOGGED_INFO=1 Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: } Mar 20 17:17:52 crc kubenswrapper[4803]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 20 17:17:52 crc kubenswrapper[4803]: log_missing_certs Mar 20 17:17:52 crc kubenswrapper[4803]: sleep 5 Mar 20 17:17:52 crc kubenswrapper[4803]: done Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 20 17:17:52 crc kubenswrapper[4803]: exec /usr/bin/kube-rbac-proxy \ Mar 20 17:17:52 crc kubenswrapper[4803]: --logtostderr \ Mar 20 17:17:52 crc kubenswrapper[4803]: --secure-listen-address=:9108 \ Mar 20 17:17:52 crc kubenswrapper[4803]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 20 17:17:52 crc kubenswrapper[4803]: --upstream=http://127.0.0.1:29108/ \ Mar 20 17:17:52 crc kubenswrapper[4803]: --tls-private-key-file=${TLS_PK} \ Mar 20 17:17:52 crc kubenswrapper[4803]: --tls-cert-file=${TLS_CERT} Mar 20 17:17:52 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nsx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q5phd_openshift-ovn-kubernetes(332635bf-724e-479c-86bc-d08bb83cd6ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:52 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4803]: E0320 17:17:52.865928 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:17:52 crc kubenswrapper[4803]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ -f "/env/_master" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: set -o allexport Mar 20 17:17:52 crc kubenswrapper[4803]: source "/env/_master" Mar 20 17:17:52 crc kubenswrapper[4803]: set +o allexport Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v4_join_subnet_opt= Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 20 
17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v6_join_subnet_opt= Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v4_transit_switch_subnet_opt= Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v6_transit_switch_subnet_opt= Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ "" != "" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: dns_name_resolver_enabled_flag= Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ "false" == "true" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: persistent_ips_enabled_flag= Mar 20 17:17:52 crc kubenswrapper[4803]: if [[ "true" == "true" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: # This is needed so that converting clusters from GA to TP Mar 20 17:17:52 crc kubenswrapper[4803]: # will rollout control plane pods as well Mar 20 17:17:52 crc kubenswrapper[4803]: network_segmentation_enabled_flag= Mar 20 17:17:52 crc kubenswrapper[4803]: multi_network_enabled_flag= Mar 20 17:17:52 crc 
kubenswrapper[4803]: if [[ "true" == "true" ]]; then Mar 20 17:17:52 crc kubenswrapper[4803]: multi_network_enabled_flag="--enable-multi-network" Mar 20 17:17:52 crc kubenswrapper[4803]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 20 17:17:52 crc kubenswrapper[4803]: fi Mar 20 17:17:52 crc kubenswrapper[4803]: Mar 20 17:17:52 crc kubenswrapper[4803]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 20 17:17:52 crc kubenswrapper[4803]: exec /usr/bin/ovnkube \ Mar 20 17:17:52 crc kubenswrapper[4803]: --enable-interconnect \ Mar 20 17:17:52 crc kubenswrapper[4803]: --init-cluster-manager "${K8S_NODE}" \ Mar 20 17:17:52 crc kubenswrapper[4803]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 20 17:17:52 crc kubenswrapper[4803]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 20 17:17:52 crc kubenswrapper[4803]: --metrics-bind-address "127.0.0.1:29108" \ Mar 20 17:17:52 crc kubenswrapper[4803]: --metrics-enable-pprof \ Mar 20 17:17:52 crc kubenswrapper[4803]: --metrics-enable-config-duration \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${ovn_v4_join_subnet_opt} \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${ovn_v6_join_subnet_opt} \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${dns_name_resolver_enabled_flag} \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${persistent_ips_enabled_flag} \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${multi_network_enabled_flag} \ Mar 20 17:17:52 crc kubenswrapper[4803]: ${network_segmentation_enabled_flag} Mar 20 17:17:52 crc kubenswrapper[4803]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6nsx6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-q5phd_openshift-ovn-kubernetes(332635bf-724e-479c-86bc-d08bb83cd6ef): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 17:17:52 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:17:52 crc kubenswrapper[4803]: E0320 17:17:52.867267 4803 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" podUID="332635bf-724e-479c-86bc-d08bb83cd6ef" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.942553 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.942627 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.942649 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.942678 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:52 crc kubenswrapper[4803]: I0320 17:17:52.942701 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:52Z","lastTransitionTime":"2026-03-20T17:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.046287 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.046349 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.046366 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.046392 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.046412 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.149571 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.149620 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.149636 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.149658 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.149675 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.252736 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.252853 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.252887 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.252918 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.252945 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.356559 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.356641 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.356669 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.356701 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.356724 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.460497 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.460600 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.460618 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.460644 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.460662 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.563989 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.564070 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.564089 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.564119 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.564150 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.611733 4803 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.667100 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.667191 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.667217 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.667250 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.667276 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.713970 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.714140 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.714218 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714278 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:09.714240552 +0000 UTC m=+99.625832663 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714323 4803 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.714357 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714393 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:09.714370987 +0000 UTC m=+99.625963097 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714418 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714452 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714474 4803 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714585 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:09.714563673 +0000 UTC m=+99.626155773 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714583 4803 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.714694 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:09.714674227 +0000 UTC m=+99.626266337 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.770870 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.770958 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.770977 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.771005 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.771023 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.815245 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.815496 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.815583 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.815608 4803 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.815700 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:09.81567497 +0000 UTC m=+99.727267070 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.847053 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.847069 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.847229 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.847285 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.847300 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.848006 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.849101 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:53 crc kubenswrapper[4803]: E0320 17:17:53.849310 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.876598 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.877362 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.877385 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.877469 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.877495 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.979644 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.979697 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.979712 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.979731 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:53 crc kubenswrapper[4803]: I0320 17:17:53.979743 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:53Z","lastTransitionTime":"2026-03-20T17:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.082781 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.082846 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.082862 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.082890 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.082908 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.185723 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.185787 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.185806 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.185833 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.185851 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.231483 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe" exitCode=0 Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.231562 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.233682 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d8jn6" event={"ID":"55c909c3-a57a-4440-9052-48718b1d2dfd","Type":"ContainerStarted","Data":"c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.253667 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\"
,\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.272639 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.288723 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.288751 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.289279 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.289298 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.289308 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.298896 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.310975 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.327506 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.349630 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.364946 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.374773 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.386869 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.392189 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.392222 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.392231 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.392267 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.392278 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.402595 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.415756 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.434807 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.445007 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.459270 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.472195 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.491106 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.499006 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.499265 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.499292 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.499329 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.499353 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.504549 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.514626 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.528339 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.541135 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.558064 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.587751 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.600505 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.602735 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.602790 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.602808 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.602886 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.602904 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.615939 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.629086 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.644932 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.664092 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.676980 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.694920 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.705425 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.705502 4803 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.705570 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.705607 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.705648 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.711954 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.808919 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.809655 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.809840 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.810076 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.810280 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.914013 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.914062 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.914077 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.914097 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:54 crc kubenswrapper[4803]: I0320 17:17:54.914112 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:54Z","lastTransitionTime":"2026-03-20T17:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.015968 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.016000 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.016008 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.016022 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.016031 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.118272 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.118331 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.118349 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.118369 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.118381 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.221498 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.221578 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.221592 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.221611 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.221623 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.240467 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.240581 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.240611 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.240653 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.240671 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.240688 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} Mar 20 17:17:55 crc kubenswrapper[4803]: 
I0320 17:17:55.325841 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.325909 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.325932 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.325963 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.325986 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.429120 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.429195 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.429212 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.429237 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.429254 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.532733 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.532793 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.532812 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.532835 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.532855 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.634862 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.634903 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.634915 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.634933 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.634946 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.738176 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.738241 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.738334 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.738369 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.738393 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.741207 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:55 crc kubenswrapper[4803]: E0320 17:17:55.741365 4803 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:55 crc kubenswrapper[4803]: E0320 17:17:55.741460 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs podName:56b68c7b-2d4d-4628-9f1f-85ec48141f82 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.74143412 +0000 UTC m=+101.653026220 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs") pod "network-metrics-daemon-llxn2" (UID: "56b68c7b-2d4d-4628-9f1f-85ec48141f82") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.841481 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.841582 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.841603 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.841630 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.841650 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.847021 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.847059 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.847109 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.847046 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:55 crc kubenswrapper[4803]: E0320 17:17:55.847211 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:55 crc kubenswrapper[4803]: E0320 17:17:55.847334 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:55 crc kubenswrapper[4803]: E0320 17:17:55.847467 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:55 crc kubenswrapper[4803]: E0320 17:17:55.847662 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.945106 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.945176 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.945198 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.945228 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:55 crc kubenswrapper[4803]: I0320 17:17:55.945253 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:55Z","lastTransitionTime":"2026-03-20T17:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.047541 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.047586 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.047596 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.047611 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.047621 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.150315 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.150364 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.150381 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.150403 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.150420 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.252998 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.253137 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.253156 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.253183 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.253200 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.356593 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.356660 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.356686 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.356715 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.356734 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.460391 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.460483 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.460566 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.460615 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.460652 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.563834 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.563898 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.563918 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.563944 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.563960 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.666932 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.666998 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.667017 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.667043 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.667061 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.770758 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.770889 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.770917 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.770951 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.770978 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.873894 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.873933 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.873943 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.873956 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.873967 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.976536 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.976581 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.976593 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.976613 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:56 crc kubenswrapper[4803]: I0320 17:17:56.976627 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:56Z","lastTransitionTime":"2026-03-20T17:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.080276 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.080317 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.080327 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.080345 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.080355 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.183704 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.183768 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.183789 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.183817 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.183838 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.287198 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.287586 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.287694 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.287777 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.287853 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.391760 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.392247 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.392412 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.392592 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.392743 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.496703 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.496759 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.496776 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.496800 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.496818 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.599343 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.599391 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.599407 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.599430 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.599448 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.702217 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.702266 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.702282 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.702304 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.702322 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.805470 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.805573 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.805601 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.805627 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.805645 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.847202 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.847281 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:57 crc kubenswrapper[4803]: E0320 17:17:57.847387 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.847398 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.847231 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:57 crc kubenswrapper[4803]: E0320 17:17:57.847507 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:57 crc kubenswrapper[4803]: E0320 17:17:57.847715 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:57 crc kubenswrapper[4803]: E0320 17:17:57.847817 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.908242 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.908330 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.908355 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.908387 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:57 crc kubenswrapper[4803]: I0320 17:17:57.908406 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:57Z","lastTransitionTime":"2026-03-20T17:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.011348 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.011394 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.011412 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.011438 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.011456 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.114448 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.114484 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.114495 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.114513 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.114528 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.217825 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.217891 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.217910 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.217943 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.217964 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.322176 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.322247 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.322260 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.322288 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.322301 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.425268 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.425344 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.425361 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.425892 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.425960 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.529139 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.529244 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.529263 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.529287 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.529305 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.633399 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.633458 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.633476 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.633502 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.633559 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.737394 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.737470 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.737506 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.737570 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.737595 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.839905 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.839973 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.839991 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.840016 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.840035 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.862711 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.943102 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.943178 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.943201 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.943226 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:58 crc kubenswrapper[4803]: I0320 17:17:58.943246 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:58Z","lastTransitionTime":"2026-03-20T17:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.045866 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.045943 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.045966 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.046000 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.046021 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.148716 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.148793 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.148811 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.148836 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.148855 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.250963 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.251008 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.251028 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.251048 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.251060 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.258762 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.265364 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.270936 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.290306 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.299376 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.308739 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.319300 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.333009 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.347809 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.359460 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.359510 4803 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.359556 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.359581 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.359602 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.362077 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.368384 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.380206 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.387021 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.394104 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.410619 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-
20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.426801 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.448919 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.461807 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.462782 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.462814 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.462824 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.462839 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.462848 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.565270 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.565315 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.565325 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.565337 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.565347 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.669415 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.669466 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.669487 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.669515 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.669570 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.771831 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.771887 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.771928 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.771962 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.771986 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.847372 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:17:59 crc kubenswrapper[4803]: E0320 17:17:59.847557 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.847978 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:17:59 crc kubenswrapper[4803]: E0320 17:17:59.848075 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.848137 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:17:59 crc kubenswrapper[4803]: E0320 17:17:59.848212 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.848266 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:17:59 crc kubenswrapper[4803]: E0320 17:17:59.848336 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.874841 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.874873 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.874889 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.874911 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.874929 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.978470 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.978638 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.978659 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.978682 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:17:59 crc kubenswrapper[4803]: I0320 17:17:59.978700 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:17:59Z","lastTransitionTime":"2026-03-20T17:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.080855 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.080915 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.080935 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.080962 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.080981 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.184089 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.184154 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.184171 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.184196 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.184215 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.287167 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.287233 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.287250 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.287276 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.287293 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.389569 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.389621 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.389639 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.389662 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.389681 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.494191 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.494247 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.494260 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.494281 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.494293 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.596554 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.596796 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.596807 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.596823 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.596834 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.699566 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.699616 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.699626 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.699646 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.699657 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.802689 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.802792 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.802820 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.802850 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.802873 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.860117 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.869461 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.883579 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.892225 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.901993 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.906619 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.906666 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.906678 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.906697 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.906709 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:00Z","lastTransitionTime":"2026-03-20T17:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.914619 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.924938 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.929816 4803 reflector.go:368] Caches populated for *v1.RuntimeClass 
from k8s.io/client-go/informers/factory.go:160 Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.935108 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.945710 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.958815 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.968379 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.976643 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.985302 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:00 crc kubenswrapper[4803]: I0320 17:18:00.994124 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.002792 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.011009 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.011055 4803 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.011072 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.011098 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.011116 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.021978 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.059343 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.059387 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.059399 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.059417 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.059428 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.070399 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.073777 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.073830 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.073845 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.073862 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.074241 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.085342 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.090431 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.090485 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.090500 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.090526 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.090580 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.101900 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.105683 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.105735 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.105753 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.105778 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.105795 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.119168 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.123904 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.123941 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.123954 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.123972 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.123985 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.137771 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ab993cfe-4b68-4c86-8f13-4224b3fe4fdc\\\",\\\"systemUUID\\\":\\\"f5fad69f-2b85-49f3-8d02-78bb8556dc89\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.137912 4803 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.139620 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.139655 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.139665 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.139708 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.139722 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.242959 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.243029 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.243052 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.243082 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.243104 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.277443 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerStarted","Data":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.277899 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.278058 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.297901 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.305023 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.315400 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.343760 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.345963 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.346122 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.346240 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.346338 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.346429 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.353810 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.368583 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.383946 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.393602 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.404017 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.416255 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.426711 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.436405 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.448232 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.449516 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.449574 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.449585 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.449600 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.449610 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.461941 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.483397 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.494859 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.506363 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.519727 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f
7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-
20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.533819 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.551916 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.551958 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.551972 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.551990 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.552002 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.555792 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.564874 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.576119 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.588719 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.596950 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.604455 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.614655 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.625067 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.633879 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.646265 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.655817 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.656028 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.656161 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.656396 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.656607 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.657162 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.665876 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.677560 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.686907 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.760228 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.760305 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.760651 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.760683 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.760704 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.847829 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.848030 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.848593 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.848702 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.849088 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.849464 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.849726 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:01 crc kubenswrapper[4803]: E0320 17:18:01.850009 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.869495 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.869579 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.869596 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.869619 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.869635 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.972679 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.972719 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.972735 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.972757 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:01 crc kubenswrapper[4803]: I0320 17:18:01.972773 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:01Z","lastTransitionTime":"2026-03-20T17:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.076623 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.076668 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.076685 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.076708 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.076724 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.179614 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.180244 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.180334 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.180423 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.180551 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.284344 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.285049 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.285269 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.286012 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.286246 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.287002 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zp898" event={"ID":"7a26ad31-dca7-4b95-80a8-d8a3db949d1a","Type":"ContainerStarted","Data":"a8d837039d69f0c8a2af5005472c0904b25d6d9e4d08f588b32ce75d16dacafa"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.288101 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.320203 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.332686 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.342911 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.357253 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.366873 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.376293 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.388777 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.388920 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.389026 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.389127 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.389287 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.394761 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\",\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.410290 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.435170 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.444770 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.458418 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.478641 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.491244 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.492457 4803 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.492501 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.492513 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.492561 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.492579 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.502589 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.518893 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.529748 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.545290 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.595356 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.595431 4803 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.595452 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.595480 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.595502 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.697785 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.697851 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.697864 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.697882 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.697896 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.800622 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.800681 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.800705 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.800734 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.800756 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.905810 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.905878 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.905897 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.905926 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:02 crc kubenswrapper[4803]: I0320 17:18:02.905945 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:02Z","lastTransitionTime":"2026-03-20T17:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.007997 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.008058 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.008071 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.008100 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.008118 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.112936 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.112989 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.113002 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.113020 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.113031 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.216009 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.216046 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.216057 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.216074 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.216085 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.293366 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"4e54151febdf484446329938e8c738d1027062bd13ed00d3adf41e37f0e9d3be"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.293432 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.295075 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ce39fc1a0671faebd96f0f1425656967f792e9555efad98e1071da9c9463efa9"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.299744 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5swf" event={"ID":"ec2c9586-ac9f-467a-a353-e43ac2a99797","Type":"ContainerStarted","Data":"ae24fd87a74d0993e37ef277f11e00adc48c9b07429b5b34802a701171a0d5f7"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.304145 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.311750 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.318152 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.318182 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.318193 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.318209 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.318220 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.320465 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.332510 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.340019 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.353758 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 
17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.365093 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e54151febdf484446329938e8c738d1027062bd13ed00d3adf41e37f0e9d3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.374903 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.401580 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.420859 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.420919 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.420940 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.420963 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.420977 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.455222 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.464271 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.476391 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\"
,\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.486313 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.497447 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.516462 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.522798 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.522861 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.522877 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.522901 4803 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.522915 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.527615 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.542902 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce39fc1a0671faebd96f0f1425656967f792e9555efad98e1071da9c9463efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.563663 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.573601 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae24fd87a74d0993e37ef277f11e00adc48c9b07429b5b34802a701171a0d5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.587135 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\"
,\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.594551 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.607428 4803 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.621149 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.625270 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.625308 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.625320 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.625339 4803 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.625352 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.632914 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.647913 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-d8jn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"55c909c3-a57a-4440-9052-48718b1d2dfd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-22l6k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-d8jn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.658655 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.670436 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.686463 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.695408 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zp898" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a26ad31-dca7-4b95-80a8-d8a3db949d1a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8d837039d69f0c8a2af5005472c0904b25d6d9e4d08f588b32ce75d16dacafa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6p2gx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zp898\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.704108 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:59Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c38fa1c60e544c93d0222f6483f8f143c733e4fd44a619d3d872aaedf06b08b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name
\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.715177 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8510a852-14e1-4aba-826c-de9d4cfac290\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e54151febdf484446329938e8c738d1027062bd13ed00d3adf41e37f0e9d3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjqbr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-26nll\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.723918 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"332635bf-724e-479c-86bc-d08bb83cd6ef\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have 
not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6nsx6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-q5phd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.727805 4803 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.727841 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.727852 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.727868 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.727878 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.829774 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.829818 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.829830 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.829846 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.829858 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.847780 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.847822 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.847847 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.848083 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:03 crc kubenswrapper[4803]: E0320 17:18:03.848230 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:03 crc kubenswrapper[4803]: E0320 17:18:03.848309 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:03 crc kubenswrapper[4803]: E0320 17:18:03.848403 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:03 crc kubenswrapper[4803]: E0320 17:18:03.848828 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.938176 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.941638 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.941649 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.941665 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:03 crc kubenswrapper[4803]: I0320 17:18:03.941675 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:03Z","lastTransitionTime":"2026-03-20T17:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.044750 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.044794 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.044806 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.044824 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.044834 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.147413 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.147464 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.147478 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.147500 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.147513 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.250890 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.250975 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.250996 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.251022 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.251040 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.353482 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.353516 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.353538 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.353557 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.353568 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.457358 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.457438 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.457467 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.457501 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.457563 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.560267 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.560319 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.560338 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.560363 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.560380 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.662201 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.662698 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.662714 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.662739 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.662755 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.764817 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.764873 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.764889 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.764911 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.764926 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.866838 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.866875 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.866888 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.866905 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.866917 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.968662 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.968703 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.968714 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.968738 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:04 crc kubenswrapper[4803]: I0320 17:18:04.968749 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:04Z","lastTransitionTime":"2026-03-20T17:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.071847 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.071889 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.071899 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.071914 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.071922 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.174024 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.174076 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.174089 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.174106 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.174117 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.277174 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.277230 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.277245 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.277268 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.277284 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.310265 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"51e64549b1388eaea8610546abd177d7f42eb71cbc00ce1f97f22b108f5639d4"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.310312 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"59f800c21ffd58d9e0d1cd13cfe8128ec1dc06be2fdecd2b2ae06d37d0e8ed40"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.313075 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" event={"ID":"332635bf-724e-479c-86bc-d08bb83cd6ef","Type":"ContainerStarted","Data":"24cc6a050937523e42013503f52956e0f96289ab631e5917ad217880117453f0"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.313153 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" event={"ID":"332635bf-724e-479c-86bc-d08bb83cd6ef","Type":"ContainerStarted","Data":"3f746b3a31a9bf240d04347478804f01eb1376f772279cb485040f36c5bd8f84"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.331336 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T17:17:06Z\\\"
,\\\"message\\\":\\\"W0320 17:17:06.149831 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 17:17:06.150506 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774027026 cert, and key in /tmp/serving-cert-150131331/serving-signer.crt, /tmp/serving-cert-150131331/serving-signer.key\\\\nI0320 17:17:06.397114 1 observer_polling.go:159] Starting file observer\\\\nW0320 17:17:06.408021 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 17:17:06.408253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 17:17:06.409362 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-150131331/tls.crt::/tmp/serving-cert-150131331/tls.key\\\\\\\"\\\\nF0320 17:17:06.787671 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.348758 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce39fc1a0671faebd96f0f1425656967f792e9555efad98e1071da9c9463efa9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.369096 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4326b171-36ab-465f-ba67-a636b36f1f89\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s7vhj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4v5dx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.380166 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5swf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2c9586-ac9f-467a-a353-e43ac2a99797\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ae24fd87a74d0993e37ef277f11e00adc48c9b07429b5b34802a701171a0d5f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrlw5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5swf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.380493 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.380523 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.380550 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.380567 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 
17:18:05.380576 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.391252 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.402820 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mbtw8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-56rll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.411780 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-llxn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"56b68c7b-2d4d-4628-9f1f-85ec48141f82\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:17:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-llxn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc 
kubenswrapper[4803]: I0320 17:18:05.422053 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d4a46ef-2f49-471c-993c-f120cf187721\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T17:16:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f7c530fb90e7ca3010c1dffece420b673f3e4d383ef735828c58ad060a94209\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T17:16:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5b0d8ab7e877a6bcc7bb7aad70d8efb5395952ce02056e273c73216724b82b82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T17:16:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T17:16:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T17:16:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.432802 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.444899 4803 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T17:17:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T17:18:05Z is after 2025-08-24T17:21:41Z" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.483199 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.483261 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc 
kubenswrapper[4803]: I0320 17:18:05.483283 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.483311 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.483332 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.484015 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-d8jn6" podStartSLOduration=51.483994697 podStartE2EDuration="51.483994697s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.467910644 +0000 UTC m=+95.379502724" watchObservedRunningTime="2026-03-20 17:18:05.483994697 +0000 UTC m=+95.395586777" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.498920 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podStartSLOduration=51.498894491 podStartE2EDuration="51.498894491s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.484137302 +0000 UTC m=+95.395729412" watchObservedRunningTime="2026-03-20 17:18:05.498894491 +0000 UTC m=+95.410486601" Mar 20 17:18:05 crc 
kubenswrapper[4803]: I0320 17:18:05.527322 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zp898" podStartSLOduration=51.527300371 podStartE2EDuration="51.527300371s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.526872106 +0000 UTC m=+95.438464166" watchObservedRunningTime="2026-03-20 17:18:05.527300371 +0000 UTC m=+95.438892481" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.561256 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-q5phd" podStartSLOduration=51.561231317 podStartE2EDuration="51.561231317s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.561035051 +0000 UTC m=+95.472627161" watchObservedRunningTime="2026-03-20 17:18:05.561231317 +0000 UTC m=+95.472823397" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.585404 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.585466 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.585485 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.585512 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.585563 4803 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.632991 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l5swf" podStartSLOduration=51.632965801 podStartE2EDuration="51.632965801s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.632199836 +0000 UTC m=+95.543791936" watchObservedRunningTime="2026-03-20 17:18:05.632965801 +0000 UTC m=+95.544557911" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.633233 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podStartSLOduration=51.63322542 podStartE2EDuration="51.63322542s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.604324694 +0000 UTC m=+95.515916824" watchObservedRunningTime="2026-03-20 17:18:05.63322542 +0000 UTC m=+95.544817530" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.656879 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=28.656855079 podStartE2EDuration="28.656855079s" podCreationTimestamp="2026-03-20 17:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.65629708 +0000 UTC 
m=+95.567889190" watchObservedRunningTime="2026-03-20 17:18:05.656855079 +0000 UTC m=+95.568447169" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.688199 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.688236 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.688250 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.688265 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.688279 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.756858 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.756837698 podStartE2EDuration="7.756837698s" podCreationTimestamp="2026-03-20 17:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:05.756180336 +0000 UTC m=+95.667772406" watchObservedRunningTime="2026-03-20 17:18:05.756837698 +0000 UTC m=+95.668429788" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.790359 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.790398 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.790406 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.790420 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.790429 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.847589 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.847620 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.847633 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.847716 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:05 crc kubenswrapper[4803]: E0320 17:18:05.847739 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:05 crc kubenswrapper[4803]: E0320 17:18:05.847804 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:05 crc kubenswrapper[4803]: E0320 17:18:05.847871 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:18:05 crc kubenswrapper[4803]: E0320 17:18:05.847941 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.892124 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.892162 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.892174 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.892191 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.892202 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.994855 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.994925 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.994942 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.994968 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:05 crc kubenswrapper[4803]: I0320 17:18:05.994988 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:05Z","lastTransitionTime":"2026-03-20T17:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.097256 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.097298 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.097311 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.097331 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.097346 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.200579 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.200621 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.200632 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.200650 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.200663 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.303892 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.304286 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.304306 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.304335 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.304359 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.345943 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-llxn2"] Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.346032 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:06 crc kubenswrapper[4803]: E0320 17:18:06.346110 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.407049 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.407122 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.407143 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.407172 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.407194 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.509827 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.509869 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.509882 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.509899 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.509910 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.611792 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.611826 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.611836 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.611850 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.611861 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.714298 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.714329 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.714339 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.714353 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.714362 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.817022 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.817061 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.817070 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.817086 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.817096 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.919278 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.919361 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.919392 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.919425 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:06 crc kubenswrapper[4803]: I0320 17:18:06.919450 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:06Z","lastTransitionTime":"2026-03-20T17:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.021495 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.021541 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.021551 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.021564 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.021576 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.124170 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.124222 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.124237 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.124255 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.124269 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.227095 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.227458 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.227682 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.227948 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.228127 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.331052 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.331105 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.331124 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.331149 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.331168 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.434116 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.434168 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.434184 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.434207 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.434222 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.537079 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.537131 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.537144 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.537162 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.537174 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.639584 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.639654 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.639674 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.639699 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.639718 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.742104 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.742306 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.742363 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.742436 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.742507 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.844948 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.845577 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.845661 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.845742 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.845821 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.847508 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.847594 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.847601 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.847558 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:07 crc kubenswrapper[4803]: E0320 17:18:07.847721 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 17:18:07 crc kubenswrapper[4803]: E0320 17:18:07.847836 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 17:18:07 crc kubenswrapper[4803]: E0320 17:18:07.848499 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-llxn2" podUID="56b68c7b-2d4d-4628-9f1f-85ec48141f82" Mar 20 17:18:07 crc kubenswrapper[4803]: E0320 17:18:07.848667 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.947912 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.947959 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.947974 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.947995 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:07 crc kubenswrapper[4803]: I0320 17:18:07.948013 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:07Z","lastTransitionTime":"2026-03-20T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.052569 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.052601 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.052612 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.052629 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.052641 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.155397 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.155923 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.155935 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.155954 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.155966 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.246768 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.263444 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.263600 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.263620 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.263644 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.263661 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.329514 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerStarted","Data":"d9ae70b8ba09e486999ba472c1c6fd5f59df55802014fc20da8c49a84acba6be"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.369215 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.369268 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.369285 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.369305 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.369322 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.472289 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.472336 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.472349 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.472367 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.472381 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.574971 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.575010 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.575021 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.575036 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.575050 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.682569 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.682619 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.682630 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.682648 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.682663 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.786574 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.786619 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.786632 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.786649 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.786661 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.888623 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.888663 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.888687 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.888707 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.888720 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.991987 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.992034 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.992071 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.992096 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:08 crc kubenswrapper[4803]: I0320 17:18:08.992112 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:08Z","lastTransitionTime":"2026-03-20T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.094917 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.094979 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.094996 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.095022 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.095041 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:09Z","lastTransitionTime":"2026-03-20T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.199221 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.199277 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.199295 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.199319 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.199336 4803 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T17:18:09Z","lastTransitionTime":"2026-03-20T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.302851 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.302901 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.302918 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.302938 4803 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.303039 4803 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.342628 4803 generic.go:334] "Generic (PLEG): container finished" podID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" containerID="d9ae70b8ba09e486999ba472c1c6fd5f59df55802014fc20da8c49a84acba6be" exitCode=0 Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.342694 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerDied","Data":"d9ae70b8ba09e486999ba472c1c6fd5f59df55802014fc20da8c49a84acba6be"} Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.342742 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerStarted","Data":"61ff755d000baa98d3f04ea8603c213eb846d7566955bf3fdb7467be7cffeee8"} Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.376676 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hlq9h"] Mar 20 17:18:09 crc kubenswrapper[4803]: 
I0320 17:18:09.377498 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.378161 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-28ntx"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.378817 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.379609 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.380601 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.395632 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.396421 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.397673 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.398370 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.399374 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.421260 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.421502 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.421792 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.421939 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.422037 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.422095 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.422425 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.424593 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsv6s"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.425090 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.426062 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ddzzp"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.426570 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ph6zk"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.426793 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.427108 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.429053 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.429317 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.429350 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.429468 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.429324 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.429817 4803 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430139 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430149 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430220 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430295 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430386 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430479 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430573 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430699 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430736 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430781 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.430865 
4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.431011 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.437636 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.437873 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.440507 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.443780 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.443979 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x6qqk"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.444444 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zzf4k"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.444942 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.445263 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.446582 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.448628 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.448877 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.448910 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.449143 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.449208 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.449347 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.449389 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.449591 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.449692 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 
17:18:09.449824 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.450131 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.450599 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.451006 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.451470 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mm9jg"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.451485 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.451667 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.451805 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.471457 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.472110 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.498423 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.498451 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.513763 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.513779 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.515636 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.515654 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.515796 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.515810 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 17:18:09 crc 
kubenswrapper[4803]: I0320 17:18:09.515928 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.515963 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.516040 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.516053 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.516125 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.516125 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.516184 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.516773 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.516929 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.517046 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.517098 4803 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.517331 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zhp4m"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.517385 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.517705 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.517777 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.518036 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.518155 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mm9jg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.518371 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.522444 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.526068 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.526584 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.526650 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.526979 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.527069 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.527492 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.528450 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.529012 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-t6lj8"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.529635 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.529679 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.529838 4803 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530076 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530191 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530274 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530368 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530558 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530631 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530681 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.530954 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.531058 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.527683 4803 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.531159 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.531358 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.531386 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.531509 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.531359 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.532085 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.534003 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.536916 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.538985 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-config\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.539409 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67c363d9-78c9-43a3-9189-9fb19ed0b384-metrics-tls\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.539670 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dttmf\" (UniqueName: \"kubernetes.io/projected/04eadf8a-430c-40c0-af91-f7bc1e02f220-kube-api-access-dttmf\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.539842 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.539954 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67c363d9-78c9-43a3-9189-9fb19ed0b384-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.539999 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04eadf8a-430c-40c0-af91-f7bc1e02f220-serving-cert\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540328 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540347 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2sg2\" (UniqueName: \"kubernetes.io/projected/c9636dcd-d0c5-4b44-96ed-aa4230163735-kube-api-access-s2sg2\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540367 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xft7v\" (UniqueName: \"kubernetes.io/projected/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-kube-api-access-xft7v\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540382 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04eadf8a-430c-40c0-af91-f7bc1e02f220-trusted-ca\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540396 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67c363d9-78c9-43a3-9189-9fb19ed0b384-trusted-ca\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540412 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-config\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540427 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540442 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540456 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-client-ca\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540469 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9636dcd-d0c5-4b44-96ed-aa4230163735-serving-cert\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540506 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-auth-proxy-config\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540544 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-machine-approver-tls\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540598 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540620 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drvn5\" (UniqueName: \"kubernetes.io/projected/374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3-kube-api-access-drvn5\") pod \"downloads-7954f5f757-mm9jg\" (UID: \"374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3\") " pod="openshift-console/downloads-7954f5f757-mm9jg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540636 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srlr4\" (UniqueName: \"kubernetes.io/projected/67c363d9-78c9-43a3-9189-9fb19ed0b384-kube-api-access-srlr4\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540653 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.540668 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04eadf8a-430c-40c0-af91-f7bc1e02f220-config\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.542631 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.542912 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.542998 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.542936 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.543174 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.543292 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.543438 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.543580 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.545346 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: 
I0320 17:18:09.545483 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.545576 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.545727 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.545853 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.545975 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546067 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546110 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546305 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546197 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546254 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546485 4803 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546570 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.546668 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.547156 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.548390 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.548475 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g6tsw"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.577178 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rb6js"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.594901 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.595236 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.596688 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.597715 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.598244 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.598431 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.600054 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.600125 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.601374 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.601710 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r9r29"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.602328 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567118-clg6s"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.602887 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.603360 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.603638 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.603841 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567118-clg6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.603952 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.604174 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.604294 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.604806 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.605266 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.605453 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.607973 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwfxw"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.609029 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.611144 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.611929 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.615255 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.616353 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.620633 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.621790 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.623125 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.624755 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.624840 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.626363 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.626540 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.626818 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.627021 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hsljx"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.627248 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.627467 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.627706 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-28ntx"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.629298 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.630890 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hlq9h"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.632328 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-97tz2"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.632743 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.633182 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x6qqk"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.634696 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ddzzp"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.635699 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.637088 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-bxk24"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.637741 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.638397 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nsp5p"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.639379 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.640426 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.641618 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9b64128-c6d2-471f-84ac-84fb6b17ea78-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wt7sp\" (UID: \"b9b64128-c6d2-471f-84ac-84fb6b17ea78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.641693 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-encryption-config\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.641776 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-config\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 
17:18:09.641803 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86hmf\" (UniqueName: \"kubernetes.io/projected/f3f47d35-b096-47cb-879d-05004b9cbcf4-kube-api-access-86hmf\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.641858 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784d72fc-b506-4908-9c03-0b696d082014-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: \"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.641890 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2sg2\" (UniqueName: \"kubernetes.io/projected/c9636dcd-d0c5-4b44-96ed-aa4230163735-kube-api-access-s2sg2\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.641976 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.641997 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c5499-4d33-4657-8a0b-31abe59e5516-config\") pod 
\"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642015 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6qkw\" (UniqueName: \"kubernetes.io/projected/b9b64128-c6d2-471f-84ac-84fb6b17ea78-kube-api-access-d6qkw\") pod \"cluster-samples-operator-665b6dd947-wt7sp\" (UID: \"b9b64128-c6d2-471f-84ac-84fb6b17ea78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642093 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-audit\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642112 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w9kb\" (UniqueName: \"kubernetes.io/projected/d2e71ab2-6214-4a0d-8745-2e5864a491b3-kube-api-access-9w9kb\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642127 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784d72fc-b506-4908-9c03-0b696d082014-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: \"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc 
kubenswrapper[4803]: I0320 17:18:09.642144 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xft7v\" (UniqueName: \"kubernetes.io/projected/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-kube-api-access-xft7v\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642160 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-serving-cert\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642177 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-client\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642193 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642274 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04eadf8a-430c-40c0-af91-f7bc1e02f220-trusted-ca\") pod 
\"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642294 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642310 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-ca\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642330 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67c363d9-78c9-43a3-9189-9fb19ed0b384-trusted-ca\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642347 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-config\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642363 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642436 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8m96\" (UniqueName: \"kubernetes.io/projected/3ce102a5-845c-4e54-ba79-cbf4f76e3341-kube-api-access-v8m96\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642458 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-config\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642477 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642494 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 
crc kubenswrapper[4803]: I0320 17:18:09.642554 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-etcd-client\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642575 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642670 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642689 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642706 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642724 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/feb9a05b-2c34-4ad7-8316-e566b399a613-metrics-tls\") pod \"dns-operator-744455d44c-x6qqk\" (UID: \"feb9a05b-2c34-4ad7-8316-e566b399a613\") " pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642740 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-serving-cert\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642759 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4nl\" (UniqueName: \"kubernetes.io/projected/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-kube-api-access-js4nl\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642784 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-client-ca\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642800 
4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9636dcd-d0c5-4b44-96ed-aa4230163735-serving-cert\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642816 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmrvd\" (UniqueName: \"kubernetes.io/projected/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-kube-api-access-pmrvd\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642833 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-stats-auth\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642850 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642958 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-config\") pod 
\"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642978 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-auth-proxy-config\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.642995 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643011 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2e71ab2-6214-4a0d-8745-2e5864a491b3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643027 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643044 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-oauth-serving-cert\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643061 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-machine-approver-tls\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643077 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-config\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643095 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-metrics-certs\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643119 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643135 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3f47d35-b096-47cb-879d-05004b9cbcf4-images\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643157 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftvg\" (UniqueName: \"kubernetes.io/projected/80befa86-f1dc-4d44-8d9a-7b50b557159d-kube-api-access-gftvg\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643175 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2e71ab2-6214-4a0d-8745-2e5864a491b3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643190 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643205 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhs7\" (UniqueName: \"kubernetes.io/projected/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-kube-api-access-sbhs7\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643221 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-serving-cert\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643237 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643252 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgql4\" (UniqueName: \"kubernetes.io/projected/baff85ab-57bf-49c5-8009-938ad47246aa-kube-api-access-tgql4\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643267 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2e71ab2-6214-4a0d-8745-2e5864a491b3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643281 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lvmd\" (UniqueName: \"kubernetes.io/projected/3c225286-1127-48e6-ae17-a55d8f21904e-kube-api-access-8lvmd\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643297 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643393 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc46m\" (UniqueName: \"kubernetes.io/projected/784d72fc-b506-4908-9c03-0b696d082014-kube-api-access-nc46m\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: \"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643409 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-oauth-config\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643426 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-audit-policies\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643448 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643466 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-audit-policies\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643485 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc 
kubenswrapper[4803]: I0320 17:18:09.643502 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f13296fc-7b19-43e5-9f80-08502dee6f1b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k5nvk\" (UID: \"f13296fc-7b19-43e5-9f80-08502dee6f1b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643510 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567118-clg6s"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643571 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643583 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ph6zk"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643538 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttwmw\" (UniqueName: \"kubernetes.io/projected/f13296fc-7b19-43e5-9f80-08502dee6f1b-kube-api-access-ttwmw\") pod \"control-plane-machine-set-operator-78cbb6b69f-k5nvk\" (UID: \"f13296fc-7b19-43e5-9f80-08502dee6f1b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643653 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:09 
crc kubenswrapper[4803]: I0320 17:18:09.643702 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643724 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-etcd-serving-ca\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643739 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-config\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643758 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wtpv\" (UniqueName: \"kubernetes.io/projected/e60f36ac-4efd-493f-9903-a0311c9d6216-kube-api-access-9wtpv\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643776 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04eadf8a-430c-40c0-af91-f7bc1e02f220-trusted-ca\") pod \"console-operator-58897d9998-28ntx\" (UID: 
\"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643797 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643826 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drvn5\" (UniqueName: \"kubernetes.io/projected/374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3-kube-api-access-drvn5\") pod \"downloads-7954f5f757-mm9jg\" (UID: \"374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3\") " pod="openshift-console/downloads-7954f5f757-mm9jg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643862 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/baff85ab-57bf-49c5-8009-938ad47246aa-audit-dir\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643883 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-encryption-config\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643902 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-srlr4\" (UniqueName: \"kubernetes.io/projected/67c363d9-78c9-43a3-9189-9fb19ed0b384-kube-api-access-srlr4\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643942 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.643961 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-service-ca\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644067 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv2pl\" (UniqueName: \"kubernetes.io/projected/353c5499-4d33-4657-8a0b-31abe59e5516-kube-api-access-qv2pl\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644152 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04eadf8a-430c-40c0-af91-f7bc1e02f220-config\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 
17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644193 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3f47d35-b096-47cb-879d-05004b9cbcf4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644210 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-service-ca\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644254 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644274 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-config\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644315 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-config\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644332 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-image-import-ca\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644351 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644355 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67c363d9-78c9-43a3-9189-9fb19ed0b384-metrics-tls\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644409 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dttmf\" (UniqueName: \"kubernetes.io/projected/04eadf8a-430c-40c0-af91-f7bc1e02f220-kube-api-access-dttmf\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644436 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-serving-cert\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644471 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644493 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c225286-1127-48e6-ae17-a55d8f21904e-audit-dir\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644511 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zszc6\" (UniqueName: \"kubernetes.io/projected/feb9a05b-2c34-4ad7-8316-e566b399a613-kube-api-access-zszc6\") pod \"dns-operator-744455d44c-x6qqk\" (UID: \"feb9a05b-2c34-4ad7-8316-e566b399a613\") " pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644550 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67c363d9-78c9-43a3-9189-9fb19ed0b384-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644567 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644585 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-serving-cert\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644601 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80befa86-f1dc-4d44-8d9a-7b50b557159d-audit-dir\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644619 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-975zn\" (UniqueName: \"kubernetes.io/projected/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-kube-api-access-975zn\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644635 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-default-certificate\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644656 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04eadf8a-430c-40c0-af91-f7bc1e02f220-serving-cert\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644674 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644696 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-trusted-ca-bundle\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.644802 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/67c363d9-78c9-43a3-9189-9fb19ed0b384-trusted-ca\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.646912 4803 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-config\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.647948 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.648954 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.649029 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04eadf8a-430c-40c0-af91-f7bc1e02f220-config\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.651488 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.651584 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zzf4k"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.651600 4803 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.652262 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-client-ca\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.652984 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.654139 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/67c363d9-78c9-43a3-9189-9fb19ed0b384-metrics-tls\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.654798 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.655109 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.655922 
4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80befa86-f1dc-4d44-8d9a-7b50b557159d-node-pullsecrets\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656150 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656239 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656368 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656421 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-config\") pod 
\"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656459 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pccqm\" (UniqueName: \"kubernetes.io/projected/f021b6de-4ba5-4116-a49f-d12a677f1746-kube-api-access-pccqm\") pod \"migrator-59844c95c7-xf7g5\" (UID: \"f021b6de-4ba5-4116-a49f-d12a677f1746\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656553 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e60f36ac-4efd-493f-9903-a0311c9d6216-service-ca-bundle\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656583 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656647 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.656923 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ce102a5-845c-4e54-ba79-cbf4f76e3341-serving-cert\") pod 
\"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.657006 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-client-ca\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.659202 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rjn\" (UniqueName: \"kubernetes.io/projected/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-kube-api-access-88rjn\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.659251 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-service-ca-bundle\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.659284 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f47d35-b096-47cb-879d-05004b9cbcf4-config\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 
crc kubenswrapper[4803]: I0320 17:18:09.659340 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/353c5499-4d33-4657-8a0b-31abe59e5516-serving-cert\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.659441 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mm9jg"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.660000 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.660265 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-etcd-client\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.660792 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-auth-proxy-config\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.661302 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-config\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.661713 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.662811 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.666822 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bxk24"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.666882 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsv6s"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.669878 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.671479 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.683332 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.683887 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04eadf8a-430c-40c0-af91-f7bc1e02f220-serving-cert\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " 
pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.683963 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-machine-approver-tls\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.685075 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.685560 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9636dcd-d0c5-4b44-96ed-aa4230163735-serving-cert\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.685594 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.686057 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.687138 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6s6cl"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.688415 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.688616 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.689302 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g6tsw"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.690242 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hsljx"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.691110 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.691918 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rb6js"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.692779 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9vjd5"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.693618 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.693621 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.694512 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nsp5p"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.695476 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-t6lj8"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.696320 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.697158 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.698090 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.698958 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.700201 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9vjd5"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.700695 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zhp4m"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.701586 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwfxw"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.702400 4803 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.703344 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc"] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.705647 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.725100 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.745505 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.760932 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761121 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-certs\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761160 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zszc6\" (UniqueName: \"kubernetes.io/projected/feb9a05b-2c34-4ad7-8316-e566b399a613-kube-api-access-zszc6\") pod 
\"dns-operator-744455d44c-x6qqk\" (UID: \"feb9a05b-2c34-4ad7-8316-e566b399a613\") " pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761185 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c225286-1127-48e6-ae17-a55d8f21904e-audit-dir\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761209 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80befa86-f1dc-4d44-8d9a-7b50b557159d-audit-dir\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761235 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-975zn\" (UniqueName: \"kubernetes.io/projected/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-kube-api-access-975zn\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761260 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-default-certificate\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761284 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kl858\" (UniqueName: \"kubernetes.io/projected/ea5c4417-cb46-4e98-be4f-473a658bd123-kube-api-access-kl858\") pod \"multus-admission-controller-857f4d67dd-rb6js\" (UID: \"ea5c4417-cb46-4e98-be4f-473a658bd123\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761309 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-serving-cert\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761331 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-trusted-ca-bundle\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761352 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-plugins-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761376 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761398 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80befa86-f1dc-4d44-8d9a-7b50b557159d-node-pullsecrets\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761420 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e60f36ac-4efd-493f-9903-a0311c9d6216-service-ca-bundle\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761442 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761464 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-config\") pod \"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761541 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761567 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-client-ca\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761588 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f47d35-b096-47cb-879d-05004b9cbcf4-config\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761616 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-etcd-client\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761638 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c607a233-38be-49b0-9953-b6416f879c2e-metrics-tls\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761661 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784d72fc-b506-4908-9c03-0b696d082014-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: 
\"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761685 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-socket-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761707 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-config\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761730 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86hmf\" (UniqueName: \"kubernetes.io/projected/f3f47d35-b096-47cb-879d-05004b9cbcf4-kube-api-access-86hmf\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761757 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761778 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/353c5499-4d33-4657-8a0b-31abe59e5516-config\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761799 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-audit\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761823 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-client\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761844 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761867 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-images\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761891 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-ca\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761911 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-config\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761935 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761958 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5822cec-11d7-4d8f-a5cb-d78527689fe8-tmpfs\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.761994 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762062 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmrvd\" (UniqueName: \"kubernetes.io/projected/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-kube-api-access-pmrvd\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762130 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762183 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-oauth-serving-cert\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762210 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2e71ab2-6214-4a0d-8745-2e5864a491b3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762234 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pg7z6\" (UniqueName: \"kubernetes.io/projected/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-kube-api-access-pg7z6\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762255 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-apiservice-cert\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762278 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-config\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762301 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-metrics-certs\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762325 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3f47d35-b096-47cb-879d-05004b9cbcf4-images\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762348 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-node-bootstrap-token\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762372 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762395 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbhs7\" (UniqueName: \"kubernetes.io/projected/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-kube-api-access-sbhs7\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762427 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762450 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2e71ab2-6214-4a0d-8745-2e5864a491b3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" 
(UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762475 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762498 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-etcd-serving-ca\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762535 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-config\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762563 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-secret-volume\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762593 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/baff85ab-57bf-49c5-8009-938ad47246aa-audit-dir\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762617 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-service-ca\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762640 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv2pl\" (UniqueName: \"kubernetes.io/projected/353c5499-4d33-4657-8a0b-31abe59e5516-kube-api-access-qv2pl\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762666 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqcn4\" (UniqueName: \"kubernetes.io/projected/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-kube-api-access-mqcn4\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762697 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3f47d35-b096-47cb-879d-05004b9cbcf4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 
17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762720 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-mountpoint-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762745 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762769 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762794 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762819 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-config\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762841 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/398ed7a7-a832-4da9-bddb-45158a16cbd6-proxy-tls\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.762865 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-image-import-ca\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.762917 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:41.762894331 +0000 UTC m=+131.674486401 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763411 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-serving-cert\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763576 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-csi-data-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763646 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-proxy-tls\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763670 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-webhook-cert\") pod 
\"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763740 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763812 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763849 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763948 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.763977 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pccqm\" (UniqueName: \"kubernetes.io/projected/f021b6de-4ba5-4116-a49f-d12a677f1746-kube-api-access-pccqm\") pod \"migrator-59844c95c7-xf7g5\" (UID: \"f021b6de-4ba5-4116-a49f-d12a677f1746\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764018 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ce102a5-845c-4e54-ba79-cbf4f76e3341-serving-cert\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764042 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rjn\" (UniqueName: \"kubernetes.io/projected/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-kube-api-access-88rjn\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764120 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-service-ca-bundle\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764142 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/353c5499-4d33-4657-8a0b-31abe59e5516-serving-cert\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: 
\"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764164 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764184 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9b64128-c6d2-471f-84ac-84fb6b17ea78-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wt7sp\" (UID: \"b9b64128-c6d2-471f-84ac-84fb6b17ea78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764204 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-encryption-config\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764229 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f4999b37-8783-4029-b4a5-7b8aa468e234-srv-cert\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764257 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d6qkw\" (UniqueName: \"kubernetes.io/projected/b9b64128-c6d2-471f-84ac-84fb6b17ea78-kube-api-access-d6qkw\") pod \"cluster-samples-operator-665b6dd947-wt7sp\" (UID: \"b9b64128-c6d2-471f-84ac-84fb6b17ea78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764277 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w9kb\" (UniqueName: \"kubernetes.io/projected/d2e71ab2-6214-4a0d-8745-2e5864a491b3-kube-api-access-9w9kb\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764296 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784d72fc-b506-4908-9c03-0b696d082014-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: \"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764317 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnw6h\" (UniqueName: \"kubernetes.io/projected/f4999b37-8783-4029-b4a5-7b8aa468e234-kube-api-access-dnw6h\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764337 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume\") pod \"collect-profiles-29567115-f7sgg\" (UID: 
\"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764356 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea5c4417-cb46-4e98-be4f-473a658bd123-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rb6js\" (UID: \"ea5c4417-cb46-4e98-be4f-473a658bd123\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764382 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-serving-cert\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764401 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764421 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8m96\" (UniqueName: \"kubernetes.io/projected/3ce102a5-845c-4e54-ba79-cbf4f76e3341-kube-api-access-v8m96\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764443 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-etcd-client\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764459 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764479 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764496 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764515 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" 
Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764576 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/feb9a05b-2c34-4ad7-8316-e566b399a613-metrics-tls\") pod \"dns-operator-744455d44c-x6qqk\" (UID: \"feb9a05b-2c34-4ad7-8316-e566b399a613\") " pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764595 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-serving-cert\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764611 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js4nl\" (UniqueName: \"kubernetes.io/projected/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-kube-api-access-js4nl\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764610 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3c225286-1127-48e6-ae17-a55d8f21904e-audit-dir\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764632 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-registration-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 
17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764652 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9rtk\" (UniqueName: \"kubernetes.io/projected/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-kube-api-access-s9rtk\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764656 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80befa86-f1dc-4d44-8d9a-7b50b557159d-audit-dir\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764704 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hppjg\" (UniqueName: \"kubernetes.io/projected/d0e2699e-927c-4274-9bcd-f20d91af15e5-kube-api-access-hppjg\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764745 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-stats-auth\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764763 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-config\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: 
\"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764782 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vc4q\" (UniqueName: \"kubernetes.io/projected/50a15cf7-9fc3-45dc-a960-a85e930f8365-kube-api-access-4vc4q\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764800 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764821 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764844 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 
17:18:09.764862 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f4999b37-8783-4029-b4a5-7b8aa468e234-profile-collector-cert\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764877 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm9cg\" (UniqueName: \"kubernetes.io/projected/c5822cec-11d7-4d8f-a5cb-d78527689fe8-kube-api-access-cm9cg\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764898 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764917 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764934 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gftvg\" (UniqueName: 
\"kubernetes.io/projected/80befa86-f1dc-4d44-8d9a-7b50b557159d-kube-api-access-gftvg\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764954 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2e71ab2-6214-4a0d-8745-2e5864a491b3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764977 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764997 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgql4\" (UniqueName: \"kubernetes.io/projected/baff85ab-57bf-49c5-8009-938ad47246aa-kube-api-access-tgql4\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765014 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lvmd\" (UniqueName: \"kubernetes.io/projected/3c225286-1127-48e6-ae17-a55d8f21904e-kube-api-access-8lvmd\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc 
kubenswrapper[4803]: I0320 17:18:09.765032 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc46m\" (UniqueName: \"kubernetes.io/projected/784d72fc-b506-4908-9c03-0b696d082014-kube-api-access-nc46m\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: \"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765049 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-oauth-config\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765066 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ztqg\" (UniqueName: \"kubernetes.io/projected/96658fb9-4742-457e-b7ec-384ef06ec6a8-kube-api-access-7ztqg\") pod \"auto-csr-approver-29567118-clg6s\" (UID: \"96658fb9-4742-457e-b7ec-384ef06ec6a8\") " pod="openshift-infra/auto-csr-approver-29567118-clg6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765085 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-serving-cert\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765101 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-audit-policies\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: 
\"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765118 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-audit-policies\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765136 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765154 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f13296fc-7b19-43e5-9f80-08502dee6f1b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k5nvk\" (UID: \"f13296fc-7b19-43e5-9f80-08502dee6f1b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765173 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttwmw\" (UniqueName: \"kubernetes.io/projected/f13296fc-7b19-43e5-9f80-08502dee6f1b-kube-api-access-ttwmw\") pod \"control-plane-machine-set-operator-78cbb6b69f-k5nvk\" (UID: \"f13296fc-7b19-43e5-9f80-08502dee6f1b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765191 4803 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765211 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxsgd\" (UniqueName: \"kubernetes.io/projected/c607a233-38be-49b0-9953-b6416f879c2e-kube-api-access-lxsgd\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765230 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wtpv\" (UniqueName: \"kubernetes.io/projected/e60f36ac-4efd-493f-9903-a0311c9d6216-kube-api-access-9wtpv\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765246 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765271 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-encryption-config\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" 
Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765288 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765308 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-service-ca\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765327 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fwg2\" (UniqueName: \"kubernetes.io/projected/398ed7a7-a832-4da9-bddb-45158a16cbd6-kube-api-access-5fwg2\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765344 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c607a233-38be-49b0-9953-b6416f879c2e-config-volume\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765362 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv25f\" (UniqueName: \"kubernetes.io/projected/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-kube-api-access-cv25f\") pod 
\"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.765377 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.766006 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784d72fc-b506-4908-9c03-0b696d082014-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: \"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.766102 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-ca\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.766471 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/80befa86-f1dc-4d44-8d9a-7b50b557159d-node-pullsecrets\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.766661 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/baff85ab-57bf-49c5-8009-938ad47246aa-audit-dir\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.766691 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-config\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.767456 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-service-ca\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.767576 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-config\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.767905 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.768168 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-client-ca\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.768370 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-serving-cert\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.769252 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-config\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.769327 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-config\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.769473 4803 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.769598 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" 
failed. No retries permitted until 2026-03-20 17:18:41.769581787 +0000 UTC m=+131.681173857 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.769598 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.769675 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.769683 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3f47d35-b096-47cb-879d-05004b9cbcf4-config\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.770350 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.770442 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.770441 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.770891 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.771369 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f3f47d35-b096-47cb-879d-05004b9cbcf4-images\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.771614 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-etcd-serving-ca\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.771824 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.772305 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-audit-policies\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.772907 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-audit-policies\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.772926 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.773256 4803 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.773486 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d2e71ab2-6214-4a0d-8745-2e5864a491b3-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.773696 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c225286-1127-48e6-ae17-a55d8f21904e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.773937 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-audit\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.774030 4803 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.764019 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/80befa86-f1dc-4d44-8d9a-7b50b557159d-image-import-ca\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc 
kubenswrapper[4803]: I0320 17:18:09.774093 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9b64128-c6d2-471f-84ac-84fb6b17ea78-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wt7sp\" (UID: \"b9b64128-c6d2-471f-84ac-84fb6b17ea78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.774367 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:41.774353708 +0000 UTC m=+131.685945788 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.774682 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.774759 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.774835 4803 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:09 crc 
kubenswrapper[4803]: I0320 17:18:09.775004 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.775510 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-encryption-config\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.775647 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/784d72fc-b506-4908-9c03-0b696d082014-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: \"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:09 crc kubenswrapper[4803]: E0320 17:18:09.775758 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:41.775743585 +0000 UTC m=+131.687335655 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.775826 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ce102a5-845c-4e54-ba79-cbf4f76e3341-service-ca-bundle\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.776357 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.777848 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d2e71ab2-6214-4a0d-8745-2e5864a491b3-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.780973 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.780980 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-serving-cert\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.781162 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.781428 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-encryption-config\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.781441 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-etcd-client\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.781924 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.781966 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/80befa86-f1dc-4d44-8d9a-7b50b557159d-etcd-client\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.782088 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/feb9a05b-2c34-4ad7-8316-e566b399a613-metrics-tls\") pod \"dns-operator-744455d44c-x6qqk\" (UID: \"feb9a05b-2c34-4ad7-8316-e566b399a613\") " pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.782189 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f3f47d35-b096-47cb-879d-05004b9cbcf4-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.782321 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-etcd-client\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.782463 4803 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.782682 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.782845 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-serving-cert\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.782992 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.783025 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.783096 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-serving-cert\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.783324 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c225286-1127-48e6-ae17-a55d8f21904e-serving-cert\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.785005 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3ce102a5-845c-4e54-ba79-cbf4f76e3341-serving-cert\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.785155 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.805499 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.808953 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-config\") pod 
\"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.825328 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.843286 4803 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.844734 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.847426 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.847444 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.847460 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.847487 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.864837 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867157 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867192 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea5c4417-cb46-4e98-be4f-473a658bd123-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rb6js\" (UID: \"ea5c4417-cb46-4e98-be4f-473a658bd123\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867241 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-registration-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867259 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9rtk\" (UniqueName: \"kubernetes.io/projected/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-kube-api-access-s9rtk\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867284 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hppjg\" (UniqueName: \"kubernetes.io/projected/d0e2699e-927c-4274-9bcd-f20d91af15e5-kube-api-access-hppjg\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867330 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vc4q\" (UniqueName: \"kubernetes.io/projected/50a15cf7-9fc3-45dc-a960-a85e930f8365-kube-api-access-4vc4q\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867352 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867631 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-registration-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.867395 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f4999b37-8783-4029-b4a5-7b8aa468e234-profile-collector-cert\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.868917 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm9cg\" (UniqueName: \"kubernetes.io/projected/c5822cec-11d7-4d8f-a5cb-d78527689fe8-kube-api-access-cm9cg\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.868965 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ztqg\" (UniqueName: \"kubernetes.io/projected/96658fb9-4742-457e-b7ec-384ef06ec6a8-kube-api-access-7ztqg\") pod \"auto-csr-approver-29567118-clg6s\" (UID: \"96658fb9-4742-457e-b7ec-384ef06ec6a8\") " pod="openshift-infra/auto-csr-approver-29567118-clg6s" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869000 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxsgd\" (UniqueName: \"kubernetes.io/projected/c607a233-38be-49b0-9953-b6416f879c2e-kube-api-access-lxsgd\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869029 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869052 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fwg2\" (UniqueName: \"kubernetes.io/projected/398ed7a7-a832-4da9-bddb-45158a16cbd6-kube-api-access-5fwg2\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869069 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c607a233-38be-49b0-9953-b6416f879c2e-config-volume\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869085 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv25f\" (UniqueName: \"kubernetes.io/projected/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-kube-api-access-cv25f\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869101 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869126 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-certs\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " 
pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869166 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl858\" (UniqueName: \"kubernetes.io/projected/ea5c4417-cb46-4e98-be4f-473a658bd123-kube-api-access-kl858\") pod \"multus-admission-controller-857f4d67dd-rb6js\" (UID: \"ea5c4417-cb46-4e98-be4f-473a658bd123\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869190 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-plugins-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869226 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c607a233-38be-49b0-9953-b6416f879c2e-metrics-tls\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869251 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-socket-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869273 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-images\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869297 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5822cec-11d7-4d8f-a5cb-d78527689fe8-tmpfs\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869313 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869331 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869331 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-plugins-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869368 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg7z6\" (UniqueName: 
\"kubernetes.io/projected/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-kube-api-access-pg7z6\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869386 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-apiservice-cert\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869402 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-node-bootstrap-token\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869343 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-socket-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869443 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-secret-volume\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869587 4803 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-mqcn4\" (UniqueName: \"kubernetes.io/projected/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-kube-api-access-mqcn4\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869640 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-mountpoint-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869662 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870470 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870499 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870538 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/398ed7a7-a832-4da9-bddb-45158a16cbd6-proxy-tls\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870617 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-csi-data-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870681 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-proxy-tls\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870708 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-webhook-cert\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870807 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f4999b37-8783-4029-b4a5-7b8aa468e234-srv-cert\") pod 
\"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.870857 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnw6h\" (UniqueName: \"kubernetes.io/projected/f4999b37-8783-4029-b4a5-7b8aa468e234-kube-api-access-dnw6h\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869737 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5822cec-11d7-4d8f-a5cb-d78527689fe8-tmpfs\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.869887 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-mountpoint-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.871041 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.871070 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/d0e2699e-927c-4274-9bcd-f20d91af15e5-csi-data-dir\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.885249 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.905184 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.910962 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.924832 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.945732 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.946968 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-config\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.965820 4803 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 17:18:09 crc kubenswrapper[4803]: I0320 17:18:09.984950 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.005610 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.024830 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.031916 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-config\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.045151 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.055224 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-oauth-config\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.065127 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.076139 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-service-ca\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.091507 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.101013 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-trusted-ca-bundle\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.111819 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.116693 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f13296fc-7b19-43e5-9f80-08502dee6f1b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k5nvk\" (UID: \"f13296fc-7b19-43e5-9f80-08502dee6f1b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.125203 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.133347 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-oauth-serving-cert\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.145553 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.157906 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-serving-cert\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.165854 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.185041 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.205917 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.214281 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.225802 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.246291 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.280086 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.286272 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.289946 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.304989 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.326452 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.331746 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ea5c4417-cb46-4e98-be4f-473a658bd123-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-rb6js\" (UID: \"ea5c4417-cb46-4e98-be4f-473a658bd123\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.346197 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.348780 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerDied","Data":"61ff755d000baa98d3f04ea8603c213eb846d7566955bf3fdb7467be7cffeee8"}
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.348779 4803 generic.go:334] "Generic (PLEG): container finished" podID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" containerID="61ff755d000baa98d3f04ea8603c213eb846d7566955bf3fdb7467be7cffeee8" exitCode=0
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.365941 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.386829 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.396959 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f4999b37-8783-4029-b4a5-7b8aa468e234-srv-cert\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.406479 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.418701 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-secret-volume\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.418764 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f4999b37-8783-4029-b4a5-7b8aa468e234-profile-collector-cert\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.426651 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.445331 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.466026 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.485829 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.505729 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.527154 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.538809 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.545070 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.555130 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.566081 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.585706 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.599426 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-default-certificate\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.608818 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.623698 4803 request.go:700] Waited for 1.019466962s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-metrics-certs-default&limit=500&resourceVersion=0
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.624673 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-stats-auth\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.625929 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.646824 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.648434 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e60f36ac-4efd-493f-9903-a0311c9d6216-metrics-certs\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.650197 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e60f36ac-4efd-493f-9903-a0311c9d6216-service-ca-bundle\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.667108 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.684799 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.708156 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.715254 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-proxy-tls\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.724861 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.745013 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.765270 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.769100 4803 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.769228 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/353c5499-4d33-4657-8a0b-31abe59e5516-config podName:353c5499-4d33-4657-8a0b-31abe59e5516 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.269191398 +0000 UTC m=+101.180783528 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/353c5499-4d33-4657-8a0b-31abe59e5516-config") pod "service-ca-operator-777779d784-8h6mv" (UID: "353c5499-4d33-4657-8a0b-31abe59e5516") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.775208 4803 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.775388 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/353c5499-4d33-4657-8a0b-31abe59e5516-serving-cert podName:353c5499-4d33-4657-8a0b-31abe59e5516 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.275375197 +0000 UTC m=+101.186967277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/353c5499-4d33-4657-8a0b-31abe59e5516-serving-cert") pod "service-ca-operator-777779d784-8h6mv" (UID: "353c5499-4d33-4657-8a0b-31abe59e5516") : failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.785578 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.793718 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.805803 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.810496 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.825315 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.849900 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.866178 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.867921 4803 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.868015 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume podName:ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.367992297 +0000 UTC m=+101.279584457 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume") pod "collect-profiles-29567115-f7sgg" (UID: "ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869431 4803 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869472 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c607a233-38be-49b0-9953-b6416f879c2e-config-volume podName:c607a233-38be-49b0-9953-b6416f879c2e nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.369460646 +0000 UTC m=+101.281052716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/c607a233-38be-49b0-9953-b6416f879c2e-config-volume") pod "dns-default-bxk24" (UID: "c607a233-38be-49b0-9953-b6416f879c2e") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869498 4803 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869558 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c607a233-38be-49b0-9953-b6416f879c2e-metrics-tls podName:c607a233-38be-49b0-9953-b6416f879c2e nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.369548559 +0000 UTC m=+101.281140629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/c607a233-38be-49b0-9953-b6416f879c2e-metrics-tls") pod "dns-default-bxk24" (UID: "c607a233-38be-49b0-9953-b6416f879c2e") : failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869586 4803 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869606 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-node-bootstrap-token podName:a4b81c15-7f14-40fe-bfa1-49c6514ff28d nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.369600771 +0000 UTC m=+101.281192841 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-node-bootstrap-token") pod "machine-config-server-97tz2" (UID: "a4b81c15-7f14-40fe-bfa1-49c6514ff28d") : failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869620 4803 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869639 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-certs podName:a4b81c15-7f14-40fe-bfa1-49c6514ff28d nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.369633822 +0000 UTC m=+101.281225892 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-certs") pod "machine-config-server-97tz2" (UID: "a4b81c15-7f14-40fe-bfa1-49c6514ff28d") : failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869654 4803 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869674 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-apiservice-cert podName:c5822cec-11d7-4d8f-a5cb-d78527689fe8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.369668203 +0000 UTC m=+101.281260273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-apiservice-cert") pod "packageserver-d55dfcdfc-v5h9f" (UID: "c5822cec-11d7-4d8f-a5cb-d78527689fe8") : failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.869719 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.870278 4803 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.870491 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-images podName:398ed7a7-a832-4da9-bddb-45158a16cbd6 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.37046249 +0000 UTC m=+101.282054600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-images") pod "machine-config-operator-74547568cd-xzzxc" (UID: "398ed7a7-a832-4da9-bddb-45158a16cbd6") : failed to sync configmap cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.872846 4803 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.872938 4803 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.872948 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-webhook-cert podName:c5822cec-11d7-4d8f-a5cb-d78527689fe8 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.372924513 +0000 UTC m=+101.284516593 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-webhook-cert") pod "packageserver-d55dfcdfc-v5h9f" (UID: "c5822cec-11d7-4d8f-a5cb-d78527689fe8") : failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: E0320 17:18:10.873027 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/398ed7a7-a832-4da9-bddb-45158a16cbd6-proxy-tls podName:398ed7a7-a832-4da9-bddb-45158a16cbd6 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:11.373007096 +0000 UTC m=+101.284599176 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/398ed7a7-a832-4da9-bddb-45158a16cbd6-proxy-tls") pod "machine-config-operator-74547568cd-xzzxc" (UID: "398ed7a7-a832-4da9-bddb-45158a16cbd6") : failed to sync secret cache: timed out waiting for the condition
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.885322 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.904992 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.924987 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.946031 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 20 17:18:10 crc kubenswrapper[4803]: I0320 17:18:10.965647 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.005354 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.026373 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.045749 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.065724 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.085067 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.106039 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.126800 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.145739 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.165709 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.186177 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.205143 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.225130 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.245736 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.266512 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.286731 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.300262 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/353c5499-4d33-4657-8a0b-31abe59e5516-serving-cert\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.300787 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c5499-4d33-4657-8a0b-31abe59e5516-config\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.303006 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353c5499-4d33-4657-8a0b-31abe59e5516-config\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.306582 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.307057 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/353c5499-4d33-4657-8a0b-31abe59e5516-serving-cert\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.326327 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.345775 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.356031 4803 generic.go:334] "Generic (PLEG): container finished" podID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" containerID="5b179634aba7766a916e647ffc512628516bcd475623b3a101d4fd4ad3a9e71c" exitCode=0
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.356090 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerDied","Data":"5b179634aba7766a916e647ffc512628516bcd475623b3a101d4fd4ad3a9e71c"}
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.365435 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.389240 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402154 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402443 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c607a233-38be-49b0-9953-b6416f879c2e-config-volume\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402513 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-certs\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402634 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c607a233-38be-49b0-9953-b6416f879c2e-metrics-tls\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402684 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-images\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402759 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-apiservice-cert\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402804 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-node-bootstrap-token\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402947 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/398ed7a7-a832-4da9-bddb-45158a16cbd6-proxy-tls\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.402983 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-webhook-cert\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.403510 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.404424 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/398ed7a7-a832-4da9-bddb-45158a16cbd6-images\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc"
Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.405583 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName:
\"kubernetes.io/configmap/c607a233-38be-49b0-9953-b6416f879c2e-config-volume\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.406490 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.407762 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-webhook-cert\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.409274 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-certs\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.410454 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-node-bootstrap-token\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.411106 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c607a233-38be-49b0-9953-b6416f879c2e-metrics-tls\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.411288 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/398ed7a7-a832-4da9-bddb-45158a16cbd6-proxy-tls\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.412180 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5822cec-11d7-4d8f-a5cb-d78527689fe8-apiservice-cert\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.429857 4803 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.446189 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.493365 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2sg2\" (UniqueName: \"kubernetes.io/projected/c9636dcd-d0c5-4b44-96ed-aa4230163735-kube-api-access-s2sg2\") pod \"controller-manager-879f6c89f-hlq9h\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.512405 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xft7v\" (UniqueName: \"kubernetes.io/projected/c70da95d-2534-4ca4-8f3b-a8e2b6c09700-kube-api-access-xft7v\") pod \"machine-approver-56656f9798-tqcns\" (UID: \"c70da95d-2534-4ca4-8f3b-a8e2b6c09700\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 
17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.536921 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drvn5\" (UniqueName: \"kubernetes.io/projected/374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3-kube-api-access-drvn5\") pod \"downloads-7954f5f757-mm9jg\" (UID: \"374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3\") " pod="openshift-console/downloads-7954f5f757-mm9jg" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.538005 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.550054 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dttmf\" (UniqueName: \"kubernetes.io/projected/04eadf8a-430c-40c0-af91-f7bc1e02f220-kube-api-access-dttmf\") pod \"console-operator-58897d9998-28ntx\" (UID: \"04eadf8a-430c-40c0-af91-f7bc1e02f220\") " pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.552342 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-mm9jg" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.564045 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srlr4\" (UniqueName: \"kubernetes.io/projected/67c363d9-78c9-43a3-9189-9fb19ed0b384-kube-api-access-srlr4\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.584902 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/67c363d9-78c9-43a3-9189-9fb19ed0b384-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rq88g\" (UID: \"67c363d9-78c9-43a3-9189-9fb19ed0b384\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.603576 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b92ec8cc-5e04-48de-b3e5-12a82c4a3df0-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qmjfg\" (UID: \"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.615442 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.627210 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.643792 4803 request.go:700] Waited for 1.949467761s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.645365 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.665502 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.684126 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.687700 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.707821 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.717625 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.720054 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.747861 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zszc6\" (UniqueName: \"kubernetes.io/projected/feb9a05b-2c34-4ad7-8316-e566b399a613-kube-api-access-zszc6\") pod \"dns-operator-744455d44c-x6qqk\" (UID: \"feb9a05b-2c34-4ad7-8316-e566b399a613\") " pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.758449 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-975zn\" (UniqueName: \"kubernetes.io/projected/cce69fc8-0b20-4f31-8a62-15f2f2292cdb-kube-api-access-975zn\") pod \"openshift-controller-manager-operator-756b6f6bc6-vbhsw\" (UID: \"cce69fc8-0b20-4f31-8a62-15f2f2292cdb\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.780175 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hlq9h"] Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.781807 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86hmf\" (UniqueName: \"kubernetes.io/projected/f3f47d35-b096-47cb-879d-05004b9cbcf4-kube-api-access-86hmf\") pod \"machine-api-operator-5694c8668f-ph6zk\" (UID: \"f3f47d35-b096-47cb-879d-05004b9cbcf4\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.838211 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.838678 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.842732 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv2pl\" (UniqueName: \"kubernetes.io/projected/353c5499-4d33-4657-8a0b-31abe59e5516-kube-api-access-qv2pl\") pod \"service-ca-operator-777779d784-8h6mv\" (UID: \"353c5499-4d33-4657-8a0b-31abe59e5516\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.845037 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:11 crc kubenswrapper[4803]: W0320 17:18:11.848322 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92ec8cc_5e04_48de_b3e5_12a82c4a3df0.slice/crio-6a9089113c687c37fdc475008079c3a4ee733c917c1a085765dba6cd554616c1 WatchSource:0}: Error finding container 6a9089113c687c37fdc475008079c3a4ee733c917c1a085765dba6cd554616c1: Status 404 returned error can't find the container with id 6a9089113c687c37fdc475008079c3a4ee733c917c1a085765dba6cd554616c1 Mar 20 17:18:11 crc kubenswrapper[4803]: W0320 17:18:11.862214 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9636dcd_d0c5_4b44_96ed_aa4230163735.slice/crio-98985f85e21b516f21d96173896b16c13ce34cbbb240df540ef00cb6e4005a0f WatchSource:0}: Error finding container 98985f85e21b516f21d96173896b16c13ce34cbbb240df540ef00cb6e4005a0f: Status 404 returned error can't find the container with id 98985f85e21b516f21d96173896b16c13ce34cbbb240df540ef00cb6e4005a0f Mar 20 17:18:11 
crc kubenswrapper[4803]: I0320 17:18:11.864160 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4nl\" (UniqueName: \"kubernetes.io/projected/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-kube-api-access-js4nl\") pod \"console-f9d7485db-t6lj8\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") " pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.869518 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mm9jg"] Mar 20 17:18:11 crc kubenswrapper[4803]: E0320 17:18:11.869985 4803 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:18:11 crc kubenswrapper[4803]: E0320 17:18:11.870025 4803 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 20 17:18:11 crc kubenswrapper[4803]: E0320 17:18:11.870131 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 17:18:43.870095712 +0000 UTC m=+133.781687792 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.870489 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d2e71ab2-6214-4a0d-8745-2e5864a491b3-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.877362 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lvmd\" (UniqueName: \"kubernetes.io/projected/3c225286-1127-48e6-ae17-a55d8f21904e-kube-api-access-8lvmd\") pod \"apiserver-7bbb656c7d-hpjq8\" (UID: \"3c225286-1127-48e6-ae17-a55d8f21904e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.879555 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9399a9e8-58dd-4eca-ac01-a9d27dbb6e85-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gz427\" (UID: \"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.896975 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.898704 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6qkw\" (UniqueName: \"kubernetes.io/projected/b9b64128-c6d2-471f-84ac-84fb6b17ea78-kube-api-access-d6qkw\") pod \"cluster-samples-operator-665b6dd947-wt7sp\" (UID: \"b9b64128-c6d2-471f-84ac-84fb6b17ea78\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.907587 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.920074 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.922358 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-28ntx"] Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.925435 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w9kb\" (UniqueName: \"kubernetes.io/projected/d2e71ab2-6214-4a0d-8745-2e5864a491b3-kube-api-access-9w9kb\") pod \"cluster-image-registry-operator-dc59b4c8b-9tpc5\" (UID: \"d2e71ab2-6214-4a0d-8745-2e5864a491b3\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.936579 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.947637 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-s2tdl\" (UID: \"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:11 crc kubenswrapper[4803]: W0320 17:18:11.967329 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04eadf8a_430c_40c0_af91_f7bc1e02f220.slice/crio-21935333f86be3b256830214ab72bc922ec5c83a58ccf8ae8675381568ef60f7 WatchSource:0}: Error finding container 21935333f86be3b256830214ab72bc922ec5c83a58ccf8ae8675381568ef60f7: Status 404 returned error can't find the container with id 21935333f86be3b256830214ab72bc922ec5c83a58ccf8ae8675381568ef60f7 Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.983941 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g"] Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.984925 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbhs7\" (UniqueName: \"kubernetes.io/projected/0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa-kube-api-access-sbhs7\") pod \"etcd-operator-b45778765-zhp4m\" (UID: \"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:11 crc kubenswrapper[4803]: I0320 17:18:11.997194 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc46m\" (UniqueName: \"kubernetes.io/projected/784d72fc-b506-4908-9c03-0b696d082014-kube-api-access-nc46m\") pod \"openshift-apiserver-operator-796bbdcf4f-4569n\" (UID: 
\"784d72fc-b506-4908-9c03-0b696d082014\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.002588 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmrvd\" (UniqueName: \"kubernetes.io/projected/34933fd8-3d5a-4dbd-adf3-1ccb51d52d00-kube-api-access-pmrvd\") pod \"openshift-config-operator-7777fb866f-lrmdq\" (UID: \"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:12 crc kubenswrapper[4803]: W0320 17:18:12.003265 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67c363d9_78c9_43a3_9189_9fb19ed0b384.slice/crio-daadc3406e5e4b991c3a40db81962876540de314161726837ab83c87a9cb3a33 WatchSource:0}: Error finding container daadc3406e5e4b991c3a40db81962876540de314161726837ab83c87a9cb3a33: Status 404 returned error can't find the container with id daadc3406e5e4b991c3a40db81962876540de314161726837ab83c87a9cb3a33 Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.029596 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wtpv\" (UniqueName: \"kubernetes.io/projected/e60f36ac-4efd-493f-9903-a0311c9d6216-kube-api-access-9wtpv\") pod \"router-default-5444994796-r9r29\" (UID: \"e60f36ac-4efd-493f-9903-a0311c9d6216\") " pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.037252 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.042597 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8m96\" (UniqueName: \"kubernetes.io/projected/3ce102a5-845c-4e54-ba79-cbf4f76e3341-kube-api-access-v8m96\") pod \"authentication-operator-69f744f599-ddzzp\" (UID: \"3ce102a5-845c-4e54-ba79-cbf4f76e3341\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.067022 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pccqm\" (UniqueName: \"kubernetes.io/projected/f021b6de-4ba5-4116-a49f-d12a677f1746-kube-api-access-pccqm\") pod \"migrator-59844c95c7-xf7g5\" (UID: \"f021b6de-4ba5-4116-a49f-d12a677f1746\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.071680 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.081674 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f927b779-a5cc-48a1-a69d-3b39e82bd4ba-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-brtwx\" (UID: \"f927b779-a5cc-48a1-a69d-3b39e82bd4ba\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.091144 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.100379 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ph6zk"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.100870 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttwmw\" (UniqueName: \"kubernetes.io/projected/f13296fc-7b19-43e5-9f80-08502dee6f1b-kube-api-access-ttwmw\") pod \"control-plane-machine-set-operator-78cbb6b69f-k5nvk\" (UID: \"f13296fc-7b19-43e5-9f80-08502dee6f1b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.116443 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-x6qqk"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.120015 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gftvg\" (UniqueName: \"kubernetes.io/projected/80befa86-f1dc-4d44-8d9a-7b50b557159d-kube-api-access-gftvg\") pod \"apiserver-76f77b778f-zzf4k\" (UID: \"80befa86-f1dc-4d44-8d9a-7b50b557159d\") " pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.122047 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:12 crc kubenswrapper[4803]: W0320 17:18:12.124181 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3f47d35_b096_47cb_879d_05004b9cbcf4.slice/crio-93c9d1f2b8f87eb7e248d11194bc86b4423e77e2c22c83eaf5dec62bac75b758 WatchSource:0}: Error finding container 93c9d1f2b8f87eb7e248d11194bc86b4423e77e2c22c83eaf5dec62bac75b758: Status 404 returned error can't find the container with id 93c9d1f2b8f87eb7e248d11194bc86b4423e77e2c22c83eaf5dec62bac75b758 Mar 20 17:18:12 crc kubenswrapper[4803]: W0320 17:18:12.127492 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfeb9a05b_2c34_4ad7_8316_e566b399a613.slice/crio-10ec790da5949a2994f194c2f2b1a675bf6fe814689477adbad36b7f3547ba75 WatchSource:0}: Error finding container 10ec790da5949a2994f194c2f2b1a675bf6fe814689477adbad36b7f3547ba75: Status 404 returned error can't find the container with id 10ec790da5949a2994f194c2f2b1a675bf6fe814689477adbad36b7f3547ba75 Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.130266 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.143663 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgql4\" (UniqueName: \"kubernetes.io/projected/baff85ab-57bf-49c5-8009-938ad47246aa-kube-api-access-tgql4\") pod \"oauth-openshift-558db77b4-tsv6s\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.145173 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.160954 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.161778 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rjn\" (UniqueName: \"kubernetes.io/projected/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-kube-api-access-88rjn\") pod \"route-controller-manager-6576b87f9c-86gtl\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.171850 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.189306 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.190974 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.207381 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.215590 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.224456 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.224945 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.228142 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.245273 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/56b68c7b-2d4d-4628-9f1f-85ec48141f82-metrics-certs\") pod \"network-metrics-daemon-llxn2\" (UID: \"56b68c7b-2d4d-4628-9f1f-85ec48141f82\") " pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.246964 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.268266 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.274615 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.281429 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.299588 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.300262 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.350121 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.350406 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hppjg\" (UniqueName: \"kubernetes.io/projected/d0e2699e-927c-4274-9bcd-f20d91af15e5-kube-api-access-hppjg\") pod \"csi-hostpathplugin-nsp5p\" (UID: \"d0e2699e-927c-4274-9bcd-f20d91af15e5\") " pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.362498 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vc4q\" (UniqueName: \"kubernetes.io/projected/50a15cf7-9fc3-45dc-a960-a85e930f8365-kube-api-access-4vc4q\") pod \"marketplace-operator-79b997595-g6tsw\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.366006 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9rtk\" (UniqueName: \"kubernetes.io/projected/bba569eb-cdfe-4d4e-911c-e9bdd9342ce5-kube-api-access-s9rtk\") pod \"kube-storage-version-migrator-operator-b67b599dd-s75tq\" (UID: \"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:12 
crc kubenswrapper[4803]: I0320 17:18:12.368233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-28ntx" event={"ID":"04eadf8a-430c-40c0-af91-f7bc1e02f220","Type":"ContainerStarted","Data":"9be6eb9051f46e5c93e4888e36db6eb19c4aab1d17478adbd2828b1e4ce31f24"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.368281 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-28ntx" event={"ID":"04eadf8a-430c-40c0-af91-f7bc1e02f220","Type":"ContainerStarted","Data":"21935333f86be3b256830214ab72bc922ec5c83a58ccf8ae8675381568ef60f7"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.368812 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.374880 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" event={"ID":"67c363d9-78c9-43a3-9189-9fb19ed0b384","Type":"ContainerStarted","Data":"daadc3406e5e4b991c3a40db81962876540de314161726837ab83c87a9cb3a33"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.376437 4803 patch_prober.go:28] interesting pod/console-operator-58897d9998-28ntx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.376494 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-28ntx" podUID="04eadf8a-430c-40c0-af91-f7bc1e02f220" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.377112 
4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" event={"ID":"c9636dcd-d0c5-4b44-96ed-aa4230163735","Type":"ContainerStarted","Data":"f01f0f9c8537248a1776b2db6051f6f0981d4cfd8dc742d678b97a0eee349e27"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.377137 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" event={"ID":"c9636dcd-d0c5-4b44-96ed-aa4230163735","Type":"ContainerStarted","Data":"98985f85e21b516f21d96173896b16c13ce34cbbb240df540ef00cb6e4005a0f"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.377652 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.378938 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-t6lj8"] Mar 20 17:18:12 crc kubenswrapper[4803]: W0320 17:18:12.382517 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode60f36ac_4efd_493f_9903_a0311c9d6216.slice/crio-e419c7ad9c990aaa98f83242958a4a5e70951b06ec02cfbd7ece968b4a8ef0b4 WatchSource:0}: Error finding container e419c7ad9c990aaa98f83242958a4a5e70951b06ec02cfbd7ece968b4a8ef0b4: Status 404 returned error can't find the container with id e419c7ad9c990aaa98f83242958a4a5e70951b06ec02cfbd7ece968b4a8ef0b4 Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.382610 4803 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hlq9h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.382672 4803 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" podUID="c9636dcd-d0c5-4b44-96ed-aa4230163735" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.390177 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mm9jg" event={"ID":"374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3","Type":"ContainerStarted","Data":"89bda0a7ef19383eada42c1d6d2f40089fd801142c34296d6a7945b77854f44e"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.390233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mm9jg" event={"ID":"374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3","Type":"ContainerStarted","Data":"2c3557536f1260ad77d80fa2009b06072f4c39bf96065668f9735e35b640317a"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.390557 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm9cg\" (UniqueName: \"kubernetes.io/projected/c5822cec-11d7-4d8f-a5cb-d78527689fe8-kube-api-access-cm9cg\") pod \"packageserver-d55dfcdfc-v5h9f\" (UID: \"c5822cec-11d7-4d8f-a5cb-d78527689fe8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.390928 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mm9jg" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.400044 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ztqg\" (UniqueName: \"kubernetes.io/projected/96658fb9-4742-457e-b7ec-384ef06ec6a8-kube-api-access-7ztqg\") pod \"auto-csr-approver-29567118-clg6s\" (UID: \"96658fb9-4742-457e-b7ec-384ef06ec6a8\") " pod="openshift-infra/auto-csr-approver-29567118-clg6s" Mar 20 17:18:12 crc kubenswrapper[4803]: 
I0320 17:18:12.401623 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" event={"ID":"cce69fc8-0b20-4f31-8a62-15f2f2292cdb","Type":"ContainerStarted","Data":"058ceac50d4df622621b818f95ed6dc2af11f5c53a48e64b55879ba87dd0f35e"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.404042 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.404081 4803 patch_prober.go:28] interesting pod/downloads-7954f5f757-mm9jg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.404134 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mm9jg" podUID="374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.429610 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" event={"ID":"f3f47d35-b096-47cb-879d-05004b9cbcf4","Type":"ContainerStarted","Data":"93c9d1f2b8f87eb7e248d11194bc86b4423e77e2c22c83eaf5dec62bac75b758"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.435382 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" event={"ID":"feb9a05b-2c34-4ad7-8316-e566b399a613","Type":"ContainerStarted","Data":"10ec790da5949a2994f194c2f2b1a675bf6fe814689477adbad36b7f3547ba75"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.439336 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.439506 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.443354 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxsgd\" (UniqueName: \"kubernetes.io/projected/c607a233-38be-49b0-9953-b6416f879c2e-kube-api-access-lxsgd\") pod \"dns-default-bxk24\" (UID: \"c607a233-38be-49b0-9953-b6416f879c2e\") " pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.455708 4803 generic.go:334] "Generic (PLEG): container finished" podID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" containerID="115ec5781af7a322c6bb97ca2517e80c511091904d4eeb6264fd2ea1e7f32fa1" exitCode=0 Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.455789 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerDied","Data":"115ec5781af7a322c6bb97ca2517e80c511091904d4eeb6264fd2ea1e7f32fa1"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.458343 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fwg2\" (UniqueName: \"kubernetes.io/projected/398ed7a7-a832-4da9-bddb-45158a16cbd6-kube-api-access-5fwg2\") pod \"machine-config-operator-74547568cd-xzzxc\" (UID: \"398ed7a7-a832-4da9-bddb-45158a16cbd6\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.463644 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" 
event={"ID":"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0","Type":"ContainerStarted","Data":"299c08f22f52c1ab6ae5a00cf93a009a9de6eaff2d4c403079d0e4de62fa78bd"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.463686 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" event={"ID":"b92ec8cc-5e04-48de-b3e5-12a82c4a3df0","Type":"ContainerStarted","Data":"6a9089113c687c37fdc475008079c3a4ee733c917c1a085765dba6cd554616c1"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.467783 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-llxn2" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.468657 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" event={"ID":"c70da95d-2534-4ca4-8f3b-a8e2b6c09700","Type":"ContainerStarted","Data":"8ed06066021d8b8a0be7d8fda0bebae4cdc49c5d9187a12fd821d13b572540fc"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.468701 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" event={"ID":"c70da95d-2534-4ca4-8f3b-a8e2b6c09700","Type":"ContainerStarted","Data":"4ba7bd6805afa7f1c705557ad01c73ecbf9296f000b22a76b7e536b7336c6056"} Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.469276 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv25f\" (UniqueName: \"kubernetes.io/projected/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-kube-api-access-cv25f\") pod \"collect-profiles-29567115-f7sgg\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.488689 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl858\" (UniqueName: 
\"kubernetes.io/projected/ea5c4417-cb46-4e98-be4f-473a658bd123-kube-api-access-kl858\") pod \"multus-admission-controller-857f4d67dd-rb6js\" (UID: \"ea5c4417-cb46-4e98-be4f-473a658bd123\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.510822 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg7z6\" (UniqueName: \"kubernetes.io/projected/a4b81c15-7f14-40fe-bfa1-49c6514ff28d-kube-api-access-pg7z6\") pod \"machine-config-server-97tz2\" (UID: \"a4b81c15-7f14-40fe-bfa1-49c6514ff28d\") " pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.524657 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqcn4\" (UniqueName: \"kubernetes.io/projected/b0606ebf-e436-487f-afb4-bbf4eee9a7c2-kube-api-access-mqcn4\") pod \"machine-config-controller-84d6567774-mjtmt\" (UID: \"b0606ebf-e436-487f-afb4-bbf4eee9a7c2\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.545491 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.550317 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnw6h\" (UniqueName: \"kubernetes.io/projected/f4999b37-8783-4029-b4a5-7b8aa468e234-kube-api-access-dnw6h\") pod \"catalog-operator-68c6474976-2b5vf\" (UID: \"f4999b37-8783-4029-b4a5-7b8aa468e234\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.557028 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.562949 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.625485 4803 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.627817 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567118-clg6s" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.633097 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.648112 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671260 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49663406-8493-44f5-8778-f177379037b0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671295 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-certificates\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671317 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671363 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671405 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-trusted-ca\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671489 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-bound-sa-token\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671510 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49663406-8493-44f5-8778-f177379037b0-srv-cert\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671586 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.671623 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-signing-key\") pod \"service-ca-9c57cc56f-hsljx\" (UID: 
\"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.679503 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" Mar 20 17:18:12 crc kubenswrapper[4803]: E0320 17:18:12.679522 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.179502896 +0000 UTC m=+103.091094966 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.680475 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-signing-cabundle\") pod \"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.680697 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdjr\" (UniqueName: \"kubernetes.io/projected/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-kube-api-access-9zdjr\") pod \"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 
crc kubenswrapper[4803]: I0320 17:18:12.680716 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrrpj\" (UniqueName: \"kubernetes.io/projected/57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289-kube-api-access-qrrpj\") pod \"package-server-manager-789f6589d5-zxfx7\" (UID: \"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.681788 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfk67\" (UniqueName: \"kubernetes.io/projected/49663406-8493-44f5-8778-f177379037b0-kube-api-access-rfk67\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.681844 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zxfx7\" (UID: \"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.681880 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2jn6\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-kube-api-access-z2jn6\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.681909 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-tls\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.687400 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.696729 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.704432 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.711206 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-97tz2" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.712603 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.715357 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.729694 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.753476 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-ddzzp"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.782999 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:12 crc kubenswrapper[4803]: E0320 17:18:12.783215 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.28318884 +0000 UTC m=+103.194780900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.783300 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-trusted-ca\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.783441 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9qgd\" (UniqueName: \"kubernetes.io/projected/84abd826-e5c1-4868-920c-10986d5e840c-kube-api-access-p9qgd\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.783642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84abd826-e5c1-4868-920c-10986d5e840c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.783770 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-bound-sa-token\") pod 
\"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.783893 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49663406-8493-44f5-8778-f177379037b0-srv-cert\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784009 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m72hk\" (UniqueName: \"kubernetes.io/projected/a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150-kube-api-access-m72hk\") pod \"ingress-canary-9vjd5\" (UID: \"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150\") " pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784025 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150-cert\") pod \"ingress-canary-9vjd5\" (UID: \"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150\") " pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784146 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784190 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/84abd826-e5c1-4868-920c-10986d5e840c-ready\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784225 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-signing-key\") pod \"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784282 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-signing-cabundle\") pod \"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784309 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdjr\" (UniqueName: \"kubernetes.io/projected/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-kube-api-access-9zdjr\") pod \"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784364 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrrpj\" (UniqueName: \"kubernetes.io/projected/57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289-kube-api-access-qrrpj\") pod \"package-server-manager-789f6589d5-zxfx7\" (UID: \"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784564 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-rfk67\" (UniqueName: \"kubernetes.io/projected/49663406-8493-44f5-8778-f177379037b0-kube-api-access-rfk67\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784584 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zxfx7\" (UID: \"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784611 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2jn6\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-kube-api-access-z2jn6\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784637 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-tls\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.784897 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84abd826-e5c1-4868-920c-10986d5e840c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.792984 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49663406-8493-44f5-8778-f177379037b0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.793181 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.793235 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-certificates\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.793361 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: E0320 17:18:12.794592 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.294567615 +0000 UTC m=+103.206159685 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.808955 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.830124 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-trusted-ca\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.835161 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.836288 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-certificates\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.846942 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-signing-cabundle\") pod \"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.850388 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-tls\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.851137 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zxfx7\" (UID: \"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.855514 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.876109 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-signing-key\") pod 
\"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.876642 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49663406-8493-44f5-8778-f177379037b0-srv-cert\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.877576 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49663406-8493-44f5-8778-f177379037b0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.888098 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfk67\" (UniqueName: \"kubernetes.io/projected/49663406-8493-44f5-8778-f177379037b0-kube-api-access-rfk67\") pod \"olm-operator-6b444d44fb-wlnjj\" (UID: \"49663406-8493-44f5-8778-f177379037b0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.894390 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2jn6\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-kube-api-access-z2jn6\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.911582 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdjr\" (UniqueName: 
\"kubernetes.io/projected/1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09-kube-api-access-9zdjr\") pod \"service-ca-9c57cc56f-hsljx\" (UID: \"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09\") " pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.966647 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.966863 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84abd826-e5c1-4868-920c-10986d5e840c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.966921 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9qgd\" (UniqueName: \"kubernetes.io/projected/84abd826-e5c1-4868-920c-10986d5e840c-kube-api-access-p9qgd\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.966943 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84abd826-e5c1-4868-920c-10986d5e840c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.966989 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m72hk\" (UniqueName: \"kubernetes.io/projected/a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150-kube-api-access-m72hk\") pod \"ingress-canary-9vjd5\" (UID: \"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150\") " pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.967008 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150-cert\") pod \"ingress-canary-9vjd5\" (UID: \"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150\") " pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.967033 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/84abd826-e5c1-4868-920c-10986d5e840c-ready\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.968111 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/84abd826-e5c1-4868-920c-10986d5e840c-ready\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: E0320 17:18:12.968215 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.468198062 +0000 UTC m=+103.379790132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.969677 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84abd826-e5c1-4868-920c-10986d5e840c-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.969690 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-bound-sa-token\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.969790 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84abd826-e5c1-4868-920c-10986d5e840c-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.980592 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150-cert\") pod \"ingress-canary-9vjd5\" (UID: \"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150\") " 
pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.982706 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.992049 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.992373 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-zhp4m"] Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.992109 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrrpj\" (UniqueName: \"kubernetes.io/projected/57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289-kube-api-access-qrrpj\") pod \"package-server-manager-789f6589d5-zxfx7\" (UID: \"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:12 crc kubenswrapper[4803]: I0320 17:18:12.992686 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9qgd\" (UniqueName: \"kubernetes.io/projected/84abd826-e5c1-4868-920c-10986d5e840c-kube-api-access-p9qgd\") pod \"cni-sysctl-allowlist-ds-6s6cl\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.008086 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.016440 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m72hk\" (UniqueName: \"kubernetes.io/projected/a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150-kube-api-access-m72hk\") pod \"ingress-canary-9vjd5\" (UID: \"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150\") " pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:13 crc kubenswrapper[4803]: W0320 17:18:13.027381 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b66be4d_95d8_41ce_9ca2_ee7c2662e8fa.slice/crio-e4a2c752b270f1b633d0ac6eaaa6208716fd93fcce5c25596fc48c1140f947cc WatchSource:0}: Error finding container e4a2c752b270f1b633d0ac6eaaa6208716fd93fcce5c25596fc48c1140f947cc: Status 404 returned error can't find the container with id e4a2c752b270f1b633d0ac6eaaa6208716fd93fcce5c25596fc48c1140f947cc Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.047847 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.056138 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9vjd5" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.068633 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.070008 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.569986202 +0000 UTC m=+103.481578272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.079299 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-28ntx" podStartSLOduration=59.079279976 podStartE2EDuration="59.079279976s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:13.079271576 +0000 UTC m=+102.990863656" watchObservedRunningTime="2026-03-20 17:18:13.079279976 +0000 UTC m=+102.990872046" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 
17:18:13.119657 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.122335 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsv6s"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.123469 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.141235 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.171082 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.171466 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.671450541 +0000 UTC m=+103.583042611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.173866 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-llxn2"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.265421 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.267299 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.273355 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.273696 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.773684646 +0000 UTC m=+103.685276716 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.290491 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zzf4k"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.300441 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl"] Mar 20 17:18:13 crc kubenswrapper[4803]: W0320 17:18:13.338803 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf021b6de_4ba5_4116_a49f_d12a677f1746.slice/crio-1787fb0422af85c0779ef03dfad2738c03631e97f26096f3a9a6818790ad51e2 WatchSource:0}: Error finding container 1787fb0422af85c0779ef03dfad2738c03631e97f26096f3a9a6818790ad51e2: Status 404 returned error can't find the container with id 1787fb0422af85c0779ef03dfad2738c03631e97f26096f3a9a6818790ad51e2 Mar 20 17:18:13 crc kubenswrapper[4803]: W0320 17:18:13.342820 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf927b779_a5cc_48a1_a69d_3b39e82bd4ba.slice/crio-670b75740985ea695fa982a81ce874084ade8d23ebe3fe6ef2d02dfb37cb298f WatchSource:0}: Error finding container 670b75740985ea695fa982a81ce874084ade8d23ebe3fe6ef2d02dfb37cb298f: Status 404 returned error can't find the container with id 670b75740985ea695fa982a81ce874084ade8d23ebe3fe6ef2d02dfb37cb298f Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.354201 4803 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g6tsw"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.374604 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.374733 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.874712331 +0000 UTC m=+103.786304401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.374876 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.375205 4803 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.875193887 +0000 UTC m=+103.786785957 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: W0320 17:18:13.380627 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11f2e60_24d7_44a3_bb32_f88bc05a50a8.slice/crio-b2a1e9a2307192f35a033ad64bd672f625a4e25b827d41ba9be64566594b4f50 WatchSource:0}: Error finding container b2a1e9a2307192f35a033ad64bd672f625a4e25b827d41ba9be64566594b4f50: Status 404 returned error can't find the container with id b2a1e9a2307192f35a033ad64bd672f625a4e25b827d41ba9be64566594b4f50 Mar 20 17:18:13 crc kubenswrapper[4803]: W0320 17:18:13.386691 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4b81c15_7f14_40fe_bfa1_49c6514ff28d.slice/crio-f4035f2657294be656bd328eead373bf2de308c83f156a245a7ce26f6f7b84b7 WatchSource:0}: Error finding container f4035f2657294be656bd328eead373bf2de308c83f156a245a7ce26f6f7b84b7: Status 404 returned error can't find the container with id f4035f2657294be656bd328eead373bf2de308c83f156a245a7ce26f6f7b84b7 Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.442405 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-nsp5p"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 
17:18:13.475700 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.475846 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:13.975817476 +0000 UTC m=+103.887409556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.475898 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.476229 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:18:13.97621757 +0000 UTC m=+103.887809640 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.479925 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r9r29" event={"ID":"e60f36ac-4efd-493f-9903-a0311c9d6216","Type":"ContainerStarted","Data":"1dd6dc053ad9d2d7f36b2883a685cebba58c52ecbd7ac51f748f4bf87862f1d8"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.479964 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r9r29" event={"ID":"e60f36ac-4efd-493f-9903-a0311c9d6216","Type":"ContainerStarted","Data":"e419c7ad9c990aaa98f83242958a4a5e70951b06ec02cfbd7ece968b4a8ef0b4"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.481243 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" event={"ID":"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63","Type":"ContainerStarted","Data":"d8da723aaded19412508f847f2051d8ae01a47320be35b9b4d31cf87d14e8351"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.482538 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" event={"ID":"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa","Type":"ContainerStarted","Data":"e4a2c752b270f1b633d0ac6eaaa6208716fd93fcce5c25596fc48c1140f947cc"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.492116 4803 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" event={"ID":"f11f2e60-24d7-44a3-bb32-f88bc05a50a8","Type":"ContainerStarted","Data":"b2a1e9a2307192f35a033ad64bd672f625a4e25b827d41ba9be64566594b4f50"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.494871 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" event={"ID":"784d72fc-b506-4908-9c03-0b696d082014","Type":"ContainerStarted","Data":"b21513507e173b5326ec38ca408a6139ae2a425e6263b18ec786bfb0533a9925"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.509136 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" event={"ID":"c70da95d-2534-4ca4-8f3b-a8e2b6c09700","Type":"ContainerStarted","Data":"ee2b5bf6f43e6098de57a1b30e4f6edca07360771aee47be8d4a9efb99351b00"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.517775 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qmjfg" podStartSLOduration=59.517761234 podStartE2EDuration="59.517761234s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:13.516943106 +0000 UTC m=+103.428535176" watchObservedRunningTime="2026-03-20 17:18:13.517761234 +0000 UTC m=+103.429353304" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.525530 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" event={"ID":"f021b6de-4ba5-4116-a49f-d12a677f1746","Type":"ContainerStarted","Data":"1787fb0422af85c0779ef03dfad2738c03631e97f26096f3a9a6818790ad51e2"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.532978 4803 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-f9d7485db-t6lj8" event={"ID":"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52","Type":"ContainerStarted","Data":"bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.533014 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t6lj8" event={"ID":"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52","Type":"ContainerStarted","Data":"3c135f9804508c758fc21f2655615ced914b20fd28b1d8d439f3715a72cdae66"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.536581 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" event={"ID":"80befa86-f1dc-4d44-8d9a-7b50b557159d","Type":"ContainerStarted","Data":"c82ef6debd499c628cea0dcb77839b0e27072461dd0e5435cda77c6a5a8466f8"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.537889 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-97tz2" event={"ID":"a4b81c15-7f14-40fe-bfa1-49c6514ff28d","Type":"ContainerStarted","Data":"f4035f2657294be656bd328eead373bf2de308c83f156a245a7ce26f6f7b84b7"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.539303 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" event={"ID":"50a15cf7-9fc3-45dc-a960-a85e930f8365","Type":"ContainerStarted","Data":"75ff172ae9a57e498a5f8ed0422d73b3e9286926699d2e857815a6b6459caeee"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.553085 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.561937 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llxn2" 
event={"ID":"56b68c7b-2d4d-4628-9f1f-85ec48141f82","Type":"ContainerStarted","Data":"a8797c7826eb69845bdaa0810b218361d8d649993ab16d2070298e8f77354679"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.573394 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" event={"ID":"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00","Type":"ContainerStarted","Data":"a5b01d8ada6453c6e2f31ed06531949dfcba4418fa0652a158480f82d6517105"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.576767 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.576901 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.076880872 +0000 UTC m=+103.988472942 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.576988 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.578213 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.078200506 +0000 UTC m=+103.989792576 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.592593 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-rb6js"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.595711 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567118-clg6s"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.602483 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" event={"ID":"d2e71ab2-6214-4a0d-8745-2e5864a491b3","Type":"ContainerStarted","Data":"c5eccf421c7182eff7418d08b6b2e7bb97355502bd0ba0173531ed0af81279e8"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.603198 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.611238 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" event={"ID":"feb9a05b-2c34-4ad7-8316-e566b399a613","Type":"ContainerStarted","Data":"170bd51620b99cab7d96a801976a2a3823ee3d3c50c71bfdc91c0728eeda77a5"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.613554 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" 
event={"ID":"cce69fc8-0b20-4f31-8a62-15f2f2292cdb","Type":"ContainerStarted","Data":"09384f10b251b7d5282114c268a4edb8e30faceaf5b823b4573da44910299225"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.616821 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" event={"ID":"baff85ab-57bf-49c5-8009-938ad47246aa","Type":"ContainerStarted","Data":"2891369a309be11942ff09790fd616ec63f368564408f0939d12bbc223c395a8"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.617951 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.618152 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" event={"ID":"3c225286-1127-48e6-ae17-a55d8f21904e","Type":"ContainerStarted","Data":"5410574a97d26c3a42b43a75c5e00b65138f1d6298174060c7f8217856b4158d"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.620002 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" event={"ID":"3ce102a5-845c-4e54-ba79-cbf4f76e3341","Type":"ContainerStarted","Data":"41b90cc607702c92876a1c8886dd73d9b633e42f06f9c5b2119ec2573f19c910"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.621059 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" event={"ID":"f927b779-a5cc-48a1-a69d-3b39e82bd4ba","Type":"ContainerStarted","Data":"670b75740985ea695fa982a81ce874084ade8d23ebe3fe6ef2d02dfb37cb298f"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.626710 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" 
event={"ID":"353c5499-4d33-4657-8a0b-31abe59e5516","Type":"ContainerStarted","Data":"d5c5b2ba2411c413fea579988d36ac3642f40a67e5da0d00fab4c24cf74867dc"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.632902 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" event={"ID":"f13296fc-7b19-43e5-9f80-08502dee6f1b","Type":"ContainerStarted","Data":"85873317bbbce4799d02f75314487dadba665e2a29f51f30aa53ab946bb32ce9"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.643583 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.658257 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" event={"ID":"f3f47d35-b096-47cb-879d-05004b9cbcf4","Type":"ContainerStarted","Data":"c501bbc53905599716f19ff7dffb61f77b48364267450b1885f7cbf73b40c44b"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.658292 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" event={"ID":"f3f47d35-b096-47cb-879d-05004b9cbcf4","Type":"ContainerStarted","Data":"72947dbb0d8dbc0dddf69c70861493cb3a9969c6b30d2fea4a60a3bac79503a0"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.669722 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" event={"ID":"67c363d9-78c9-43a3-9189-9fb19ed0b384","Type":"ContainerStarted","Data":"e7fe7e94e82dae4ccde6ea96d54c1fac248a69f9090d3ecec31845f845d9b8e5"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.669770 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" 
event={"ID":"67c363d9-78c9-43a3-9189-9fb19ed0b384","Type":"ContainerStarted","Data":"b0f96eac72f4bddc7025295ee33cf2d6f63944d17ce64c79a231efecb8edcf90"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.680722 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.681332 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.181300861 +0000 UTC m=+104.092892931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.685921 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" podStartSLOduration=59.685901266 podStartE2EDuration="59.685901266s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:13.65168453 +0000 UTC m=+103.563276620" watchObservedRunningTime="2026-03-20 17:18:13.685901266 +0000 UTC m=+103.597493326" Mar 20 
17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.687076 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-bxk24"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.699150 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerStarted","Data":"97b1ad14c1e27767e07c2a833198d7f5b0166387bd0fc0d803ec332b19f68daf"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.705384 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.710062 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" event={"ID":"b9b64128-c6d2-471f-84ac-84fb6b17ea78","Type":"ContainerStarted","Data":"1c2e08e92dbdbb3c87ba5f90e0bca0c7595620c514a46e04234c63af9c1fe76e"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.710100 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" event={"ID":"b9b64128-c6d2-471f-84ac-84fb6b17ea78","Type":"ContainerStarted","Data":"3bae312659021869c3df2c54edbb40f97acbde41f27ab4a1fd6a8b1ed84e70c6"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.714853 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" event={"ID":"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85","Type":"ContainerStarted","Data":"6ea747d324c5927a01ceb89a220e7b7f9161c7c1772e96620b7863ff0bb868b2"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.714894 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" 
event={"ID":"9399a9e8-58dd-4eca-ac01-a9d27dbb6e85","Type":"ContainerStarted","Data":"a08ed287c4faf51e606c291428d4138948261c6de0ebe0a3ebcd4223000cb475"} Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.715954 4803 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-hlq9h container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.716003 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" podUID="c9636dcd-d0c5-4b44-96ed-aa4230163735" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.716772 4803 patch_prober.go:28] interesting pod/downloads-7954f5f757-mm9jg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.716836 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mm9jg" podUID="374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.717341 4803 patch_prober.go:28] interesting pod/console-operator-58897d9998-28ntx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 17:18:13 crc kubenswrapper[4803]: 
I0320 17:18:13.717407 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-28ntx" podUID="04eadf8a-430c-40c0-af91-f7bc1e02f220" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/readyz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.789877 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.807746 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.307727763 +0000 UTC m=+104.219319833 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.907626 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.907651 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.407629019 +0000 UTC m=+104.319221089 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.917100 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:13 crc kubenswrapper[4803]: E0320 17:18:13.917403 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.417391299 +0000 UTC m=+104.328983369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.940174 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj"] Mar 20 17:18:13 crc kubenswrapper[4803]: I0320 17:18:13.969712 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-hsljx"] Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.011665 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.019398 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.020073 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.520057559 +0000 UTC m=+104.431649619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: W0320 17:18:14.029957 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc607a233_38be_49b0_9953_b6416f879c2e.slice/crio-b41b08aec9622014d4ddd529b06973c32cee24520c229f3aaefeeb9893beebe6 WatchSource:0}: Error finding container b41b08aec9622014d4ddd529b06973c32cee24520c229f3aaefeeb9893beebe6: Status 404 returned error can't find the container with id b41b08aec9622014d4ddd529b06973c32cee24520c229f3aaefeeb9893beebe6 Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.038463 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9vjd5"] Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.121200 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.121660 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.621644592 +0000 UTC m=+104.533236662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.144074 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt"] Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.151323 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7"] Mar 20 17:18:14 crc kubenswrapper[4803]: W0320 17:18:14.214796 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57fa8f9a_6b19_4a5b_9fc5_a6b59e2bc289.slice/crio-991c74c61ebb2e0b935d29af89b1a84543b28f1c890b1b5928ce9d5caa43870a WatchSource:0}: Error finding container 991c74c61ebb2e0b935d29af89b1a84543b28f1c890b1b5928ce9d5caa43870a: Status 404 returned error can't find the container with id 991c74c61ebb2e0b935d29af89b1a84543b28f1c890b1b5928ce9d5caa43870a Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.221728 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.221947 4803 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.72189397 +0000 UTC m=+104.633486030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.222052 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.222298 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.722290703 +0000 UTC m=+104.633882773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.284046 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.284851 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.284886 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.323550 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.324877 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.824840139 +0000 UTC m=+104.736432219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.335238 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.335688 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.835676095 +0000 UTC m=+104.747268165 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.436936 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.437090 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.937067982 +0000 UTC m=+104.848660052 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.437544 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.438816 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:14.93880463 +0000 UTC m=+104.850396700 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.538705 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.538947 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.038911183 +0000 UTC m=+104.950503253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.540308 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.541228 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.041207681 +0000 UTC m=+104.952799741 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.585710 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mm9jg" podStartSLOduration=60.585687034 podStartE2EDuration="1m0.585687034s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:14.584689991 +0000 UTC m=+104.496282071" watchObservedRunningTime="2026-03-20 17:18:14.585687034 +0000 UTC m=+104.497279104" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.598746 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40144: no serving certificate available for the kubelet" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.641656 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.642002 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 17:18:15.141988427 +0000 UTC m=+105.053580497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.737818 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40152: no serving certificate available for the kubelet" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.752927 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.753325 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.253309769 +0000 UTC m=+105.164901839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.782051 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ph6zk" podStartSLOduration=60.78202706 podStartE2EDuration="1m0.78202706s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:14.780632752 +0000 UTC m=+104.692224822" watchObservedRunningTime="2026-03-20 17:18:14.78202706 +0000 UTC m=+104.693619130" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.790472 4803 generic.go:334] "Generic (PLEG): container finished" podID="34933fd8-3d5a-4dbd-adf3-1ccb51d52d00" containerID="c4eb7a83435c8dfd338161574507d97a39627f1461609ee822b5051253da8d9d" exitCode=0 Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.790972 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" event={"ID":"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00","Type":"ContainerDied","Data":"c4eb7a83435c8dfd338161574507d97a39627f1461609ee822b5051253da8d9d"} Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.798513 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" event={"ID":"f021b6de-4ba5-4116-a49f-d12a677f1746","Type":"ContainerStarted","Data":"eab671b9a934e5ec135e48625771bc4cc65775f786a8ff92295b73c98101a2f0"} Mar 20 
17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.810727 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40166: no serving certificate available for the kubelet" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.825363 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" event={"ID":"80befa86-f1dc-4d44-8d9a-7b50b557159d","Type":"ContainerStarted","Data":"eba302f462086101c2e6fcd07fdea42115f769251b0509732829287b1cf08225"} Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.853962 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.854059 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.354037243 +0000 UTC m=+105.265629313 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.854177 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.854442 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.354431296 +0000 UTC m=+105.266023366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.876499 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-t6lj8" podStartSLOduration=60.876479342 podStartE2EDuration="1m0.876479342s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:14.871142461 +0000 UTC m=+104.782734531" watchObservedRunningTime="2026-03-20 17:18:14.876479342 +0000 UTC m=+104.788071412" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.896473 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-tqcns" podStartSLOduration=61.896444286 podStartE2EDuration="1m1.896444286s" podCreationTimestamp="2026-03-20 17:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:14.896277701 +0000 UTC m=+104.807869771" watchObservedRunningTime="2026-03-20 17:18:14.896444286 +0000 UTC m=+104.808036356" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.932388 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" event={"ID":"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09","Type":"ContainerStarted","Data":"158b16bb7c1bd8540226076bc7b0469549de21de61f4d1d7a60ba231374b75d3"} Mar 20 17:18:14 crc kubenswrapper[4803]: 
I0320 17:18:14.932459 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" event={"ID":"4f0ce4e3-ee25-4af4-b2ae-f7022f65fa63","Type":"ContainerStarted","Data":"619c1915866c9d910e8c4e1336d8fd66ace3297cc9d952889938f745907dd3a3"} Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.932937 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40182: no serving certificate available for the kubelet" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.941753 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-97tz2" event={"ID":"a4b81c15-7f14-40fe-bfa1-49c6514ff28d","Type":"ContainerStarted","Data":"dfff664d860bb1b0a67ff99353e90306d58af3421c41973bf4f7e8e954543f19"} Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.954845 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:14 crc kubenswrapper[4803]: E0320 17:18:14.958949 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.458897497 +0000 UTC m=+105.370489567 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.974267 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vbhsw" podStartSLOduration=60.974243335 podStartE2EDuration="1m0.974243335s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:14.951030711 +0000 UTC m=+104.862622791" watchObservedRunningTime="2026-03-20 17:18:14.974243335 +0000 UTC m=+104.885835405" Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.977205 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" event={"ID":"353c5499-4d33-4657-8a0b-31abe59e5516","Type":"ContainerStarted","Data":"f7750c1ed13e77fe8b9796d4aaaf66101c19b701646c19bddc67caf3ef14b133"} Mar 20 17:18:14 crc kubenswrapper[4803]: I0320 17:18:14.995082 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" event={"ID":"d2e71ab2-6214-4a0d-8745-2e5864a491b3","Type":"ContainerStarted","Data":"be54cf17626893023000ea65c8950b3b9a221fbd2ae67c6a8d15dbd8c9fad70b"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.002937 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r9r29" podStartSLOduration=61.002914694 
podStartE2EDuration="1m1.002914694s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:14.993636121 +0000 UTC m=+104.905228191" watchObservedRunningTime="2026-03-20 17:18:15.002914694 +0000 UTC m=+104.914506764" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.003924 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" event={"ID":"398ed7a7-a832-4da9-bddb-45158a16cbd6","Type":"ContainerStarted","Data":"6aace847ca46c82fe59aaa40fe6d36d45689619b962134d42077c0825c2c762d"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.025330 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40188: no serving certificate available for the kubelet" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.027271 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rq88g" podStartSLOduration=61.027252867 podStartE2EDuration="1m1.027252867s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.025847609 +0000 UTC m=+104.937439699" watchObservedRunningTime="2026-03-20 17:18:15.027252867 +0000 UTC m=+104.938844937" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.033362 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" event={"ID":"49663406-8493-44f5-8778-f177379037b0","Type":"ContainerStarted","Data":"8a3f6f861db22dbb35018d9f14b1dd5ed2c2a79eebcdb5fe8eab7651c74012ab"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.035317 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" event={"ID":"0b66be4d-95d8-41ce-9ca2-ee7c2662e8fa","Type":"ContainerStarted","Data":"894b07043896e24296b4afc3ec6457a72c3b86200b4f964bec2de87e66bb2f9a"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.069251 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.069573 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.569561267 +0000 UTC m=+105.481153337 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.075932 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gz427" podStartSLOduration=61.075919032 podStartE2EDuration="1m1.075919032s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.075726405 +0000 UTC m=+104.987318485" watchObservedRunningTime="2026-03-20 17:18:15.075919032 +0000 UTC m=+104.987511102" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.079618 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" event={"ID":"c5822cec-11d7-4d8f-a5cb-d78527689fe8","Type":"ContainerStarted","Data":"1034dbac116d136700fa6b63c103fee94a2ec7792e043ec7bea69f3599f06281"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.091686 4803 generic.go:334] "Generic (PLEG): container finished" podID="3c225286-1127-48e6-ae17-a55d8f21904e" containerID="5899700bf65890830b714bfb3ac049c0fc88fe2e5a069df1bfd0779ba6b7aabc" exitCode=0 Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.092296 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" 
event={"ID":"3c225286-1127-48e6-ae17-a55d8f21904e","Type":"ContainerDied","Data":"5899700bf65890830b714bfb3ac049c0fc88fe2e5a069df1bfd0779ba6b7aabc"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.137873 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9vjd5" event={"ID":"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150","Type":"ContainerStarted","Data":"1eb9ae2c39cae4e23c6a22625cf0ec1a1de46b780135d9932a26b8200be578a5"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.157365 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" event={"ID":"3ce102a5-845c-4e54-ba79-cbf4f76e3341","Type":"ContainerStarted","Data":"325f3cfe05539d8955348713e2444083c6d044c6e06ea09a3dc2d72f2511a13b"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.161501 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-zhp4m" podStartSLOduration=61.161483703 podStartE2EDuration="1m1.161483703s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.159536207 +0000 UTC m=+105.071128277" watchObservedRunningTime="2026-03-20 17:18:15.161483703 +0000 UTC m=+105.073075773" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.170312 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.170416 4803 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.670400005 +0000 UTC m=+105.581992075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.170598 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.171750 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.67174179 +0000 UTC m=+105.583333860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.206811 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40204: no serving certificate available for the kubelet" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.207999 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-s2tdl" podStartSLOduration=61.207983675 podStartE2EDuration="1m1.207983675s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.207027412 +0000 UTC m=+105.118619482" watchObservedRunningTime="2026-03-20 17:18:15.207983675 +0000 UTC m=+105.119575745" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.208413 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" event={"ID":"84abd826-e5c1-4868-920c-10986d5e840c","Type":"ContainerStarted","Data":"9c636b04ed7c5ad82eea07c89829e53a12b5656c6fbf0557de3115bc7c9d8736"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.259072 4803 generic.go:334] "Generic (PLEG): container finished" podID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" containerID="97b1ad14c1e27767e07c2a833198d7f5b0166387bd0fc0d803ec332b19f68daf" exitCode=0 Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.259161 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" 
event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerDied","Data":"97b1ad14c1e27767e07c2a833198d7f5b0166387bd0fc0d803ec332b19f68daf"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.276245 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.277399 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.77738006 +0000 UTC m=+105.688972130 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.295904 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567118-clg6s" event={"ID":"96658fb9-4742-457e-b7ec-384ef06ec6a8","Type":"ContainerStarted","Data":"ad2a6326995dec5239845e2a6a52239d479bbc5184e792b77908d867220f9e5f"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.302864 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:15 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 17:18:15 crc kubenswrapper[4803]: [+]process-running ok Mar 20 17:18:15 crc kubenswrapper[4803]: healthz check failed Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.302907 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.339266 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" event={"ID":"d0e2699e-927c-4274-9bcd-f20d91af15e5","Type":"ContainerStarted","Data":"fc620a7f61394a2e91b57601d4a45d0ac4fca473afb730de1d06268c054a9db5"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.343814 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-97tz2" podStartSLOduration=6.343791614 podStartE2EDuration="6.343791614s" podCreationTimestamp="2026-03-20 17:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.248277826 +0000 UTC m=+105.159869896" watchObservedRunningTime="2026-03-20 17:18:15.343791614 +0000 UTC m=+105.255383684" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.344651 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-9tpc5" podStartSLOduration=61.344644843 podStartE2EDuration="1m1.344644843s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.33892367 +0000 UTC 
m=+105.250515740" watchObservedRunningTime="2026-03-20 17:18:15.344644843 +0000 UTC m=+105.256236913" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.363072 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" event={"ID":"50a15cf7-9fc3-45dc-a960-a85e930f8365","Type":"ContainerStarted","Data":"8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.364708 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.369987 4803 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-g6tsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.370059 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.380344 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.381420 4803 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.881408186 +0000 UTC m=+105.793000256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.385969 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" event={"ID":"b9b64128-c6d2-471f-84ac-84fb6b17ea78","Type":"ContainerStarted","Data":"bd29575ff5c47ac7f37bdfdd55872185856eb015ed6d718ce7637762fb692bea"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.398564 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-8h6mv" podStartSLOduration=61.398545765 podStartE2EDuration="1m1.398545765s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.397929434 +0000 UTC m=+105.309521514" watchObservedRunningTime="2026-03-20 17:18:15.398545765 +0000 UTC m=+105.310137825" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.410051 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" 
event={"ID":"f4999b37-8783-4029-b4a5-7b8aa468e234","Type":"ContainerStarted","Data":"c560d8177e36a9836b80882478d9ac8ce79959b246cf66e10f5d97e70436fbe5"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.410991 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.423758 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" event={"ID":"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5","Type":"ContainerStarted","Data":"aca0db9a12f27f6119ee95150415040caaee60d88c27d8b807915647370dd851"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.430642 4803 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2b5vf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.430708 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" podUID="f4999b37-8783-4029-b4a5-7b8aa468e234" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.464831 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" event={"ID":"784d72fc-b506-4908-9c03-0b696d082014","Type":"ContainerStarted","Data":"63570220dabee746f30591932177972d7ef45a1055a8abab26dcf8cc943327b0"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.480757 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" event={"ID":"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289","Type":"ContainerStarted","Data":"991c74c61ebb2e0b935d29af89b1a84543b28f1c890b1b5928ce9d5caa43870a"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.481919 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.482687 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:15.982667458 +0000 UTC m=+105.894259528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.488931 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40218: no serving certificate available for the kubelet" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.496222 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" event={"ID":"f927b779-a5cc-48a1-a69d-3b39e82bd4ba","Type":"ContainerStarted","Data":"c55132e4348861e74f9700aebc728e5946aa13f9654f07a9bd3543baee1bc614"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.558492 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" event={"ID":"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a","Type":"ContainerStarted","Data":"e78a976bab2a67efa091d6ccbdda747f58e1630ad2b658f2a7afd2b83d5bafae"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.568931 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" event={"ID":"ea5c4417-cb46-4e98-be4f-473a658bd123","Type":"ContainerStarted","Data":"be6de6165acf7cc0b58a957491bba064c71817c59fc109af633a79fb4020c937"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.583693 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.584908 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.084895992 +0000 UTC m=+105.996488062 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.596765 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" event={"ID":"f13296fc-7b19-43e5-9f80-08502dee6f1b","Type":"ContainerStarted","Data":"57b2d184c3f5b1ad8c2184b77d620d98da13c9f90c38419d01116b359cd3b795"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.599818 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" event={"ID":"feb9a05b-2c34-4ad7-8316-e566b399a613","Type":"ContainerStarted","Data":"bd3364884fcf120038d664e4dbb0d47f1d1b7940e1f544fb75807d493088b8d6"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.662117 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bxk24" event={"ID":"c607a233-38be-49b0-9953-b6416f879c2e","Type":"ContainerStarted","Data":"b41b08aec9622014d4ddd529b06973c32cee24520c229f3aaefeeb9893beebe6"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 
17:18:15.676342 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" podStartSLOduration=61.676324102 podStartE2EDuration="1m1.676324102s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.610138145 +0000 UTC m=+105.521730215" watchObservedRunningTime="2026-03-20 17:18:15.676324102 +0000 UTC m=+105.587916172" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.688165 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.689451 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.189436605 +0000 UTC m=+106.101028675 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.718863 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" event={"ID":"b0606ebf-e436-487f-afb4-bbf4eee9a7c2","Type":"ContainerStarted","Data":"add797b0696aa22caf003fb44b4b6dc590bf7583d6bdabd77da838865b9aff79"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.760723 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-ddzzp" podStartSLOduration=62.760708884 podStartE2EDuration="1m2.760708884s" podCreationTimestamp="2026-03-20 17:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.758737037 +0000 UTC m=+105.670329107" watchObservedRunningTime="2026-03-20 17:18:15.760708884 +0000 UTC m=+105.672300944" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.797413 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wt7sp" podStartSLOduration=61.797400374 podStartE2EDuration="1m1.797400374s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.796896107 +0000 UTC m=+105.708488177" watchObservedRunningTime="2026-03-20 17:18:15.797400374 +0000 UTC 
m=+105.708992444" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.807072 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.807367 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.307356 +0000 UTC m=+106.218948070 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.809986 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llxn2" event={"ID":"56b68c7b-2d4d-4628-9f1f-85ec48141f82","Type":"ContainerStarted","Data":"22254a39f42156bfd33f2d33609c2b1e60c45ddd8f9ca65d50f6f7214d3ac80a"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.834427 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" podStartSLOduration=61.834413045 podStartE2EDuration="1m1.834413045s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.834215288 +0000 UTC m=+105.745807368" watchObservedRunningTime="2026-03-20 17:18:15.834413045 +0000 UTC m=+105.746005115" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.835151 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" event={"ID":"baff85ab-57bf-49c5-8009-938ad47246aa","Type":"ContainerStarted","Data":"adc48f9207c753a1ebbfcae066f5e56440c200be25054b311a061c49711ac3a1"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.835428 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.846714 4803 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tsv6s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" start-of-body= Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.846774 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" podUID="baff85ab-57bf-49c5-8009-938ad47246aa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": dial tcp 10.217.0.21:6443: connect: connection refused" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.871327 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" event={"ID":"f11f2e60-24d7-44a3-bb32-f88bc05a50a8","Type":"ContainerStarted","Data":"478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83"} Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.871360 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.872995 4803 patch_prober.go:28] interesting pod/downloads-7954f5f757-mm9jg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.873023 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mm9jg" podUID="374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.887503 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4569n" podStartSLOduration=62.887488918 podStartE2EDuration="1m2.887488918s" podCreationTimestamp="2026-03-20 17:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.885699498 +0000 UTC m=+105.797291578" watchObservedRunningTime="2026-03-20 17:18:15.887488918 +0000 UTC m=+105.799080988" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.887918 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40226: no serving certificate available for the kubelet" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.895543 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.908400 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:15 crc kubenswrapper[4803]: E0320 17:18:15.910307 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.410289149 +0000 UTC m=+106.321881219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.935268 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-28ntx" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.938549 4803 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-86gtl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.938587 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" podUID="f11f2e60-24d7-44a3-bb32-f88bc05a50a8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 
10.217.0.31:8443: connect: connection refused" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.939006 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-brtwx" podStartSLOduration=61.938995899 podStartE2EDuration="1m1.938995899s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.935631225 +0000 UTC m=+105.847223305" watchObservedRunningTime="2026-03-20 17:18:15.938995899 +0000 UTC m=+105.850587969" Mar 20 17:18:15 crc kubenswrapper[4803]: I0320 17:18:15.983224 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-x6qqk" podStartSLOduration=61.983203253 podStartE2EDuration="1m1.983203253s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:15.982757208 +0000 UTC m=+105.894349288" watchObservedRunningTime="2026-03-20 17:18:15.983203253 +0000 UTC m=+105.894795323" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.019870 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.020407 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:18:16.52039152 +0000 UTC m=+106.431983590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.042839 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k5nvk" podStartSLOduration=62.042808898 podStartE2EDuration="1m2.042808898s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:16.041021697 +0000 UTC m=+105.952613767" watchObservedRunningTime="2026-03-20 17:18:16.042808898 +0000 UTC m=+105.954400978" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.117207 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" podStartSLOduration=62.117176691 podStartE2EDuration="1m2.117176691s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:16.110017859 +0000 UTC m=+106.021609949" watchObservedRunningTime="2026-03-20 17:18:16.117176691 +0000 UTC m=+106.028768761" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.123026 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.123155 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.623134032 +0000 UTC m=+106.534726102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.123485 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.124034 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.624022542 +0000 UTC m=+106.535614612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.151645 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" podStartSLOduration=63.151624305 podStartE2EDuration="1m3.151624305s" podCreationTimestamp="2026-03-20 17:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:16.151286674 +0000 UTC m=+106.062878744" watchObservedRunningTime="2026-03-20 17:18:16.151624305 +0000 UTC m=+106.063216375" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.172357 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" podStartSLOduration=62.172328845 podStartE2EDuration="1m2.172328845s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:16.170091209 +0000 UTC m=+106.081683289" watchObservedRunningTime="2026-03-20 17:18:16.172328845 +0000 UTC m=+106.083920915" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.234242 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.234869 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.734832087 +0000 UTC m=+106.646424157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.287851 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:16 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 17:18:16 crc kubenswrapper[4803]: [+]process-running ok Mar 20 17:18:16 crc kubenswrapper[4803]: healthz check failed Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.287907 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.335384 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.336136 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.83612242 +0000 UTC m=+106.747714480 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.437222 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.437596 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:16.937580799 +0000 UTC m=+106.849172869 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.539114 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.539408 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.03939662 +0000 UTC m=+106.950988690 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.613159 4803 ???:1] "http: TLS handshake error from 192.168.126.11:40240: no serving certificate available for the kubelet" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.641147 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.641425 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.141381686 +0000 UTC m=+107.052973756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.641510 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.641987 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.141979036 +0000 UTC m=+107.053571106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.742957 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.743160 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.243127725 +0000 UTC m=+107.154719795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.743753 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.744134 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.244125978 +0000 UTC m=+107.155718048 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.846124 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.846357 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.346326082 +0000 UTC m=+107.257918152 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.846652 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.847077 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.347060837 +0000 UTC m=+107.258652907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.893414 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" event={"ID":"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289","Type":"ContainerStarted","Data":"178c5794a4b61f36c9a0880be83591e83ae398869038b8b5dbd2a031673747a9"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.893475 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" event={"ID":"57fa8f9a-6b19-4a5b-9fc5-a6b59e2bc289","Type":"ContainerStarted","Data":"f1cec57a45bf6809a7674566497979ae664132ce80212a7d845c2da6e7964c14"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.893840 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.914810 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" event={"ID":"b0606ebf-e436-487f-afb4-bbf4eee9a7c2","Type":"ContainerStarted","Data":"78a1fcdc12a4ca64d32e9d29846940a9c02e6908492f7705db66e59c19db273d"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.914862 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" 
event={"ID":"b0606ebf-e436-487f-afb4-bbf4eee9a7c2","Type":"ContainerStarted","Data":"1976db8ccdf601936d94b6f592303099b9a5d754c16b57fffe106384cc536f41"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.928743 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" podStartSLOduration=62.928715917 podStartE2EDuration="1m2.928715917s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:16.925323122 +0000 UTC m=+106.836915202" watchObservedRunningTime="2026-03-20 17:18:16.928715917 +0000 UTC m=+106.840307987" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.937274 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" event={"ID":"bba569eb-cdfe-4d4e-911c-e9bdd9342ce5","Type":"ContainerStarted","Data":"4858503d34a715550d4a27eecdd3e785908f7cf886d349abfb15c61b9437642a"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.948125 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:16 crc kubenswrapper[4803]: E0320 17:18:16.949360 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.449321473 +0000 UTC m=+107.360913543 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.951104 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" event={"ID":"398ed7a7-a832-4da9-bddb-45158a16cbd6","Type":"ContainerStarted","Data":"687b02209a415bf4b690595e0f9ed3a57831d4a4fe14e7caa92781101071c289"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.951147 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" event={"ID":"398ed7a7-a832-4da9-bddb-45158a16cbd6","Type":"ContainerStarted","Data":"31ceabdb0935f3bf7f51e2b31057e586614122410c6f08a779fa660f40375da4"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.954689 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-mjtmt" podStartSLOduration=62.954661223 podStartE2EDuration="1m2.954661223s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:16.953300157 +0000 UTC m=+106.864892237" watchObservedRunningTime="2026-03-20 17:18:16.954661223 +0000 UTC m=+106.866253293" Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.966741 4803 generic.go:334] "Generic (PLEG): container finished" podID="80befa86-f1dc-4d44-8d9a-7b50b557159d" 
containerID="eba302f462086101c2e6fcd07fdea42115f769251b0509732829287b1cf08225" exitCode=0 Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.966928 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" event={"ID":"80befa86-f1dc-4d44-8d9a-7b50b557159d","Type":"ContainerDied","Data":"eba302f462086101c2e6fcd07fdea42115f769251b0509732829287b1cf08225"} Mar 20 17:18:16 crc kubenswrapper[4803]: I0320 17:18:16.975420 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" event={"ID":"d0e2699e-927c-4274-9bcd-f20d91af15e5","Type":"ContainerStarted","Data":"0f51535f848f981f9bcc813e1fe49aeb0bd66d5519d3ac70c62dec7f22841082"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.002860 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" event={"ID":"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a","Type":"ContainerStarted","Data":"acc2f987afb39df4999db0d2666d8d9e52c97902961a5c4f37dedb9b0bdb043e"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.024218 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" event={"ID":"84abd826-e5c1-4868-920c-10986d5e840c","Type":"ContainerStarted","Data":"19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.024950 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.041387 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" event={"ID":"f021b6de-4ba5-4116-a49f-d12a677f1746","Type":"ContainerStarted","Data":"3b9db70ba59d5f4ec4265ebebcf04642c005a27f513abb501a715e2dab108bfa"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.050759 4803 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xzzxc" podStartSLOduration=63.050728669 podStartE2EDuration="1m3.050728669s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:16.988857568 +0000 UTC m=+106.900449638" watchObservedRunningTime="2026-03-20 17:18:17.050728669 +0000 UTC m=+106.962320729" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.051109 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s75tq" podStartSLOduration=63.051101362 podStartE2EDuration="1m3.051101362s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.044477758 +0000 UTC m=+106.956069828" watchObservedRunningTime="2026-03-20 17:18:17.051101362 +0000 UTC m=+106.962693432" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.051910 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.053183 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.553169612 +0000 UTC m=+107.464761682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.155950 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.157046 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.657031992 +0000 UTC m=+107.568624062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.158031 4803 generic.go:334] "Generic (PLEG): container finished" podID="dad4e80b-88f1-4e64-a7e6-136c1d3b6e67" containerID="15617a7ae405260f44115d0ad91e62bc2b15b0e94338f604ad696b7358827ea4" exitCode=0 Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.158091 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerDied","Data":"15617a7ae405260f44115d0ad91e62bc2b15b0e94338f604ad696b7358827ea4"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.159267 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" podStartSLOduration=8.159237596 podStartE2EDuration="8.159237596s" podCreationTimestamp="2026-03-20 17:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.14160755 +0000 UTC m=+107.053199620" watchObservedRunningTime="2026-03-20 17:18:17.159237596 +0000 UTC m=+107.070829666" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.175981 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.183663 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-xf7g5" podStartSLOduration=63.183638191 podStartE2EDuration="1m3.183638191s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.183084922 +0000 UTC m=+107.094676992" watchObservedRunningTime="2026-03-20 17:18:17.183638191 +0000 UTC m=+107.095230261" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.187633 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bxk24" event={"ID":"c607a233-38be-49b0-9953-b6416f879c2e","Type":"ContainerStarted","Data":"83c4eab34d3e1273b0b127519a77a19dc8b2566fbfee5e22052505d3666d759e"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.187684 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-bxk24" event={"ID":"c607a233-38be-49b0-9953-b6416f879c2e","Type":"ContainerStarted","Data":"745d1b3a1f06f99fc9f99eeac53049b646aedc878c39e57ecebe72865b925e84"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.188297 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-bxk24" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.226116 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-llxn2" event={"ID":"56b68c7b-2d4d-4628-9f1f-85ec48141f82","Type":"ContainerStarted","Data":"442eb27c3065f89ab870649b8ffbc19460d0371d62d0b9e8198b2ecd2834a883"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.230017 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" event={"ID":"34933fd8-3d5a-4dbd-adf3-1ccb51d52d00","Type":"ContainerStarted","Data":"995bee8b26ef5bb835ca90b9d2c1eea32bb33e9821a845e85d1707de57afcd95"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.230368 4803 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.232949 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" event={"ID":"ea5c4417-cb46-4e98-be4f-473a658bd123","Type":"ContainerStarted","Data":"fec4e0e23171ff16e71a766904d2c3b0ab029de14e35fe2e935e6819fd114ae1"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.233021 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" event={"ID":"ea5c4417-cb46-4e98-be4f-473a658bd123","Type":"ContainerStarted","Data":"9ff03ec962fd0c9922b4974ea00d27ee4e9828489775a2f5a2535e998a3bb367"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.252326 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" event={"ID":"f4999b37-8783-4029-b4a5-7b8aa468e234","Type":"ContainerStarted","Data":"8c8929dc808043693132fec3e6653c2e0f1ab068e1934ab9b86d9ced0d8a1ad4"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.254001 4803 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-2b5vf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.254056 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" podUID="f4999b37-8783-4029-b4a5-7b8aa468e234" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/healthz\": dial tcp 10.217.0.26:8443: connect: connection refused" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.263910 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.266706 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.766691077 +0000 UTC m=+107.678283147 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.286067 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" event={"ID":"3c225286-1127-48e6-ae17-a55d8f21904e","Type":"ContainerStarted","Data":"4633a02c50c511015634a63746ed6588cb91bdd9685119b98975ab34c770089d"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.295785 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:17 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 17:18:17 crc kubenswrapper[4803]: [+]process-running ok Mar 20 
17:18:17 crc kubenswrapper[4803]: healthz check failed Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.295866 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.313115 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9vjd5" event={"ID":"a37cbdf6-3ad2-4bbf-8bf9-f3e3fcc41150","Type":"ContainerStarted","Data":"c0120cae1e106351393d04c876d8d8ae52c823b5e197d8882cb099da07518ca6"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.327211 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" podStartSLOduration=63.327180582 podStartE2EDuration="1m3.327180582s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.325081371 +0000 UTC m=+107.236673451" watchObservedRunningTime="2026-03-20 17:18:17.327180582 +0000 UTC m=+107.238772652" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.351807 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" event={"ID":"1ec8ef95-30b2-4ff1-9e3d-47f0f6739c09","Type":"ContainerStarted","Data":"e1fc330295f607613f059330d7bbe37daa86a6e1428f642558cbb6ede7604808"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.372210 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 
20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.374699 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.874671947 +0000 UTC m=+107.786264017 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.374898 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.377618 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.877411939 +0000 UTC m=+107.789004009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.382715 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-bxk24" podStartSLOduration=8.382686247 podStartE2EDuration="8.382686247s" podCreationTimestamp="2026-03-20 17:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.372156412 +0000 UTC m=+107.283748502" watchObservedRunningTime="2026-03-20 17:18:17.382686247 +0000 UTC m=+107.294278317" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.406376 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" event={"ID":"49663406-8493-44f5-8778-f177379037b0","Type":"ContainerStarted","Data":"78e18c5fd85fc2ccb703b8d117cca50f6b116b3ab21700030ba6eb1c4c0efea1"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.407515 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.407895 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-llxn2" podStartSLOduration=63.407866118 podStartE2EDuration="1m3.407866118s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:18:17.407197926 +0000 UTC m=+107.318790016" watchObservedRunningTime="2026-03-20 17:18:17.407866118 +0000 UTC m=+107.319458188" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.415319 4803 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-wlnjj container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.415438 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" podUID="49663406-8493-44f5-8778-f177379037b0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.418296 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" event={"ID":"c5822cec-11d7-4d8f-a5cb-d78527689fe8","Type":"ContainerStarted","Data":"60c1d0416eeb6292097325b300be58644cdae748f6a84c187dbcef59533800aa"} Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.418439 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.419850 4803 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-g6tsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.419927 4803 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.419855 4803 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-v5h9f container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.420018 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" podUID="c5822cec-11d7-4d8f-a5cb-d78527689fe8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.484140 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.484386 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.984353003 +0000 UTC m=+107.895945073 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.484702 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.489548 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:17.989506587 +0000 UTC m=+107.901098657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.516941 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.529524 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-rb6js" podStartSLOduration=63.529503789 podStartE2EDuration="1m3.529503789s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.528191795 +0000 UTC m=+107.439783865" watchObservedRunningTime="2026-03-20 17:18:17.529503789 +0000 UTC m=+107.441095859" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.529711 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" podStartSLOduration=63.529707146 podStartE2EDuration="1m3.529707146s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.458565062 +0000 UTC m=+107.370157152" watchObservedRunningTime="2026-03-20 17:18:17.529707146 +0000 UTC m=+107.441299216" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.588244 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.594711 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.094661411 +0000 UTC m=+108.006253471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.594874 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.595257 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.095250551 +0000 UTC m=+108.006842621 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.613987 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" podStartSLOduration=63.613970524 podStartE2EDuration="1m3.613970524s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.613073863 +0000 UTC m=+107.524665933" watchObservedRunningTime="2026-03-20 17:18:17.613970524 +0000 UTC m=+107.525562594" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.696002 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.696400 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.196386189 +0000 UTC m=+108.107978249 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.708842 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-hsljx" podStartSLOduration=63.708822999 podStartE2EDuration="1m3.708822999s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.643840033 +0000 UTC m=+107.555432103" watchObservedRunningTime="2026-03-20 17:18:17.708822999 +0000 UTC m=+107.620415069" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.710047 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" podStartSLOduration=63.71003889 podStartE2EDuration="1m3.71003889s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.7073616 +0000 UTC m=+107.618953670" watchObservedRunningTime="2026-03-20 17:18:17.71003889 +0000 UTC m=+107.621630960" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.760828 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9vjd5" podStartSLOduration=8.760800536 podStartE2EDuration="8.760800536s" podCreationTimestamp="2026-03-20 17:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:17.758216998 +0000 UTC m=+107.669809088" watchObservedRunningTime="2026-03-20 17:18:17.760800536 +0000 UTC m=+107.672392606" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.797623 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.798073 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.298057265 +0000 UTC m=+108.209649325 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.813081 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6s6cl"] Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.880603 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.911820 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:17 crc kubenswrapper[4803]: E0320 17:18:17.912681 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.412652798 +0000 UTC m=+108.324244868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.948013 4803 ???:1] "http: TLS handshake error from 192.168.126.11:52610: no serving certificate available for the kubelet" Mar 20 17:18:17 crc kubenswrapper[4803]: I0320 17:18:17.970565 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hlq9h"] Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.014211 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.014566 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.514551121 +0000 UTC m=+108.426143191 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.038965 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.039518 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.047887 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.052051 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.074453 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.088311 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl"] Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.115777 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.116008 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.116062 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.116192 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.616178546 +0000 UTC m=+108.527770616 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.144603 4803 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-lrmdq container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.144716 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" podUID="34933fd8-3d5a-4dbd-adf3-1ccb51d52d00" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.145060 4803 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-lrmdq container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.145200 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" podUID="34933fd8-3d5a-4dbd-adf3-1ccb51d52d00" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: 
connection refused" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.222075 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.222118 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.222359 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.222439 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.222694 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:18:18.722682625 +0000 UTC m=+108.634274695 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.229384 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.229360001 podStartE2EDuration="1.229360001s" podCreationTimestamp="2026-03-20 17:18:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:18.225368936 +0000 UTC m=+108.136961016" watchObservedRunningTime="2026-03-20 17:18:18.229360001 +0000 UTC m=+108.140952071" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.284294 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.286411 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:18 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 17:18:18 crc kubenswrapper[4803]: [+]process-running ok Mar 20 17:18:18 crc 
kubenswrapper[4803]: healthz check failed Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.286773 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.323270 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.323724 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.823705489 +0000 UTC m=+108.735297559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.358777 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.421606 4803 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tsv6s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.21:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.421664 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" podUID="baff85ab-57bf-49c5-8009-938ad47246aa" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.21:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.424498 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.424845 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:18.924833657 +0000 UTC m=+108.836425727 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.524341 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-56rll" event={"ID":"dad4e80b-88f1-4e64-a7e6-136c1d3b6e67","Type":"ContainerStarted","Data":"b877f7b9106c7dcd99bea7d00187d6c63ba0c4461c4353e86213fe4ef92a5ab7"} Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.525550 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.525879 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.025865661 +0000 UTC m=+108.937457731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.525934 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.526193 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.026186682 +0000 UTC m=+108.937778752 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.537252 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" event={"ID":"80befa86-f1dc-4d44-8d9a-7b50b557159d","Type":"ContainerStarted","Data":"501bca99e4286f521ba07413ead33f22802ca61b077d3477af90466a932b4eca"} Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.537309 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" event={"ID":"80befa86-f1dc-4d44-8d9a-7b50b557159d","Type":"ContainerStarted","Data":"1bee4270b6d5d04ed2fb271eb4e6b9ea854b72b8699de38807a803486f348344"} Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.539972 4803 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-g6tsw container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.540385 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.541086 4803 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" podUID="c9636dcd-d0c5-4b44-96ed-aa4230163735" containerName="controller-manager" containerID="cri-o://f01f0f9c8537248a1776b2db6051f6f0981d4cfd8dc742d678b97a0eee349e27" gracePeriod=30 Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.548822 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-wlnjj" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.570027 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-2b5vf" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.629337 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.629602 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.129580106 +0000 UTC m=+109.041172176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.630033 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.630306 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.130299501 +0000 UTC m=+109.041891571 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.658998 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-56rll" podStartSLOduration=64.65898605 podStartE2EDuration="1m4.65898605s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:18.652116328 +0000 UTC m=+108.563708408" watchObservedRunningTime="2026-03-20 17:18:18.65898605 +0000 UTC m=+108.570578120" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.734030 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.734286 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.234260794 +0000 UTC m=+109.145852874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.734985 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.738326 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.238310641 +0000 UTC m=+109.149902711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.779812 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" podStartSLOduration=65.779792933 podStartE2EDuration="1m5.779792933s" podCreationTimestamp="2026-03-20 17:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:18.757760668 +0000 UTC m=+108.669352758" watchObservedRunningTime="2026-03-20 17:18:18.779792933 +0000 UTC m=+108.691384993" Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.838643 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.838911 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.33889711 +0000 UTC m=+109.250489180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:18 crc kubenswrapper[4803]: I0320 17:18:18.941293 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:18 crc kubenswrapper[4803]: E0320 17:18:18.942045 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.442030676 +0000 UTC m=+109.353622746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.044200 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lrmdq" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.045190 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.045469 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.545456331 +0000 UTC m=+109.457048401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.147770 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.148245 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.648229284 +0000 UTC m=+109.559821354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.168583 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.250110 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.251038 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.751022089 +0000 UTC m=+109.662614159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.289970 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:19 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 17:18:19 crc kubenswrapper[4803]: [+]process-running ok Mar 20 17:18:19 crc kubenswrapper[4803]: healthz check failed Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.290072 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.352081 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.352552 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 17:18:19.852513768 +0000 UTC m=+109.764105828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.397658 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-v5h9f" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.465861 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.493102 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:19.993056348 +0000 UTC m=+109.904648418 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.558818 4803 generic.go:334] "Generic (PLEG): container finished" podID="c9636dcd-d0c5-4b44-96ed-aa4230163735" containerID="f01f0f9c8537248a1776b2db6051f6f0981d4cfd8dc742d678b97a0eee349e27" exitCode=0 Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.558877 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" event={"ID":"c9636dcd-d0c5-4b44-96ed-aa4230163735","Type":"ContainerDied","Data":"f01f0f9c8537248a1776b2db6051f6f0981d4cfd8dc742d678b97a0eee349e27"} Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.561166 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a25cb952-33ff-4a43-b67f-040ee9ed83b1","Type":"ContainerStarted","Data":"d0e93ac84b16e2dad79c29c182a3f85eb1d9ae7511696d24e2a991848550baef"} Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.561333 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" gracePeriod=30 Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.562701 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" 
podUID="f11f2e60-24d7-44a3-bb32-f88bc05a50a8" containerName="route-controller-manager" containerID="cri-o://478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83" gracePeriod=30 Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.573087 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.573411 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.073400293 +0000 UTC m=+109.984992363 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.609759 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.675118 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.675214 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.175200054 +0000 UTC m=+110.086792124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.675405 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.678399 4803 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.178377401 +0000 UTC m=+110.089969471 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.783118 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9636dcd-d0c5-4b44-96ed-aa4230163735-serving-cert\") pod \"c9636dcd-d0c5-4b44-96ed-aa4230163735\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.783189 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2sg2\" (UniqueName: \"kubernetes.io/projected/c9636dcd-d0c5-4b44-96ed-aa4230163735-kube-api-access-s2sg2\") pod \"c9636dcd-d0c5-4b44-96ed-aa4230163735\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.783226 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-client-ca\") pod \"c9636dcd-d0c5-4b44-96ed-aa4230163735\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.783346 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.783392 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-proxy-ca-bundles\") pod \"c9636dcd-d0c5-4b44-96ed-aa4230163735\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.783426 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-config\") pod \"c9636dcd-d0c5-4b44-96ed-aa4230163735\" (UID: \"c9636dcd-d0c5-4b44-96ed-aa4230163735\") " Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.784107 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-client-ca" (OuterVolumeSpecName: "client-ca") pod "c9636dcd-d0c5-4b44-96ed-aa4230163735" (UID: "c9636dcd-d0c5-4b44-96ed-aa4230163735"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.784214 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.284195617 +0000 UTC m=+110.195787687 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.784243 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-config" (OuterVolumeSpecName: "config") pod "c9636dcd-d0c5-4b44-96ed-aa4230163735" (UID: "c9636dcd-d0c5-4b44-96ed-aa4230163735"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.784499 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c9636dcd-d0c5-4b44-96ed-aa4230163735" (UID: "c9636dcd-d0c5-4b44-96ed-aa4230163735"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.790038 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9636dcd-d0c5-4b44-96ed-aa4230163735-kube-api-access-s2sg2" (OuterVolumeSpecName: "kube-api-access-s2sg2") pod "c9636dcd-d0c5-4b44-96ed-aa4230163735" (UID: "c9636dcd-d0c5-4b44-96ed-aa4230163735"). InnerVolumeSpecName "kube-api-access-s2sg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.791391 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9636dcd-d0c5-4b44-96ed-aa4230163735-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c9636dcd-d0c5-4b44-96ed-aa4230163735" (UID: "c9636dcd-d0c5-4b44-96ed-aa4230163735"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.885566 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.885680 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.885692 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.885702 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9636dcd-d0c5-4b44-96ed-aa4230163735-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.885710 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2sg2\" (UniqueName: \"kubernetes.io/projected/c9636dcd-d0c5-4b44-96ed-aa4230163735-kube-api-access-s2sg2\") on node \"crc\" DevicePath 
\"\"" Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.885719 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9636dcd-d0c5-4b44-96ed-aa4230163735-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.885965 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.385954446 +0000 UTC m=+110.297546516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:19 crc kubenswrapper[4803]: I0320 17:18:19.988899 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:19 crc kubenswrapper[4803]: E0320 17:18:19.989535 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.489504106 +0000 UTC m=+110.401096176 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.085111 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.090952 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.091415 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.591398039 +0000 UTC m=+110.502990109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.116464 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68bb848755-bf8k6"] Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.116821 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9636dcd-d0c5-4b44-96ed-aa4230163735" containerName="controller-manager" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.116854 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9636dcd-d0c5-4b44-96ed-aa4230163735" containerName="controller-manager" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.116869 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11f2e60-24d7-44a3-bb32-f88bc05a50a8" containerName="route-controller-manager" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.116874 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11f2e60-24d7-44a3-bb32-f88bc05a50a8" containerName="route-controller-manager" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.116999 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9636dcd-d0c5-4b44-96ed-aa4230163735" containerName="controller-manager" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.117021 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11f2e60-24d7-44a3-bb32-f88bc05a50a8" containerName="route-controller-manager" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.118674 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.141762 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68bb848755-bf8k6"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.203335 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-config\") pod \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.203419 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-serving-cert\") pod \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.203630 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.203781 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88rjn\" (UniqueName: \"kubernetes.io/projected/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-kube-api-access-88rjn\") pod \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.203810 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-client-ca\") pod 
\"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\" (UID: \"f11f2e60-24d7-44a3-bb32-f88bc05a50a8\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.204979 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-client-ca" (OuterVolumeSpecName: "client-ca") pod "f11f2e60-24d7-44a3-bb32-f88bc05a50a8" (UID: "f11f2e60-24d7-44a3-bb32-f88bc05a50a8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.205553 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-config" (OuterVolumeSpecName: "config") pod "f11f2e60-24d7-44a3-bb32-f88bc05a50a8" (UID: "f11f2e60-24d7-44a3-bb32-f88bc05a50a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.212428 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.712396528 +0000 UTC m=+110.623988598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.216686 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f11f2e60-24d7-44a3-bb32-f88bc05a50a8" (UID: "f11f2e60-24d7-44a3-bb32-f88bc05a50a8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.219263 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-kube-api-access-88rjn" (OuterVolumeSpecName: "kube-api-access-88rjn") pod "f11f2e60-24d7-44a3-bb32-f88bc05a50a8" (UID: "f11f2e60-24d7-44a3-bb32-f88bc05a50a8"). InnerVolumeSpecName "kube-api-access-88rjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.228859 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gddcq"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.234111 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gddcq"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.234259 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.237392 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.285667 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:20 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 17:18:20 crc kubenswrapper[4803]: [+]process-running ok Mar 20 17:18:20 crc kubenswrapper[4803]: healthz check failed Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.285726 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.306482 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-proxy-ca-bundles\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.306593 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-config\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc 
kubenswrapper[4803]: I0320 17:18:20.306710 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7def941c-8b59-47e1-b0a8-b9b91e0ef645-serving-cert\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.306825 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm6rq\" (UniqueName: \"kubernetes.io/projected/7def941c-8b59-47e1-b0a8-b9b91e0ef645-kube-api-access-vm6rq\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.306893 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.306953 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-client-ca\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.307066 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88rjn\" (UniqueName: 
\"kubernetes.io/projected/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-kube-api-access-88rjn\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.307078 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.307104 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.307113 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f11f2e60-24d7-44a3-bb32-f88bc05a50a8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.307385 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.807373648 +0000 UTC m=+110.718965718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.410369 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.410578 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-config\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.410626 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7def941c-8b59-47e1-b0a8-b9b91e0ef645-serving-cert\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.410677 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-utilities\") pod \"certified-operators-gddcq\" (UID: 
\"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.410709 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm6rq\" (UniqueName: \"kubernetes.io/projected/7def941c-8b59-47e1-b0a8-b9b91e0ef645-kube-api-access-vm6rq\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.410758 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gjntl"] Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.410859 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:20.910840835 +0000 UTC m=+110.822432905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.411712 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.410769 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vh4n\" (UniqueName: \"kubernetes.io/projected/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-kube-api-access-9vh4n\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.412274 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-client-ca\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.412434 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-catalog-content\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.412614 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-proxy-ca-bundles\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.413589 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-config\") pod 
\"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.414226 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-client-ca\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.414295 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-proxy-ca-bundles\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.417601 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.427701 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7def941c-8b59-47e1-b0a8-b9b91e0ef645-serving-cert\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.431265 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm6rq\" (UniqueName: \"kubernetes.io/projected/7def941c-8b59-47e1-b0a8-b9b91e0ef645-kube-api-access-vm6rq\") pod \"controller-manager-68bb848755-bf8k6\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " 
pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.433465 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjntl"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.478247 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.513498 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdrbd\" (UniqueName: \"kubernetes.io/projected/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-kube-api-access-pdrbd\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.513742 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-utilities\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.513858 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-utilities\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.513944 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.514016 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vh4n\" (UniqueName: \"kubernetes.io/projected/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-kube-api-access-9vh4n\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.514102 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-catalog-content\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.514182 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-catalog-content\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.514357 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:21.014340832 +0000 UTC m=+110.925932902 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.514759 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-catalog-content\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.514830 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-utilities\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.537750 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vh4n\" (UniqueName: \"kubernetes.io/projected/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-kube-api-access-9vh4n\") pod \"certified-operators-gddcq\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.545980 4803 ???:1] "http: TLS handshake error from 192.168.126.11:52614: no serving certificate available for the kubelet" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.551406 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.564190 4803 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.573306 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" event={"ID":"c9636dcd-d0c5-4b44-96ed-aa4230163735","Type":"ContainerDied","Data":"98985f85e21b516f21d96173896b16c13ce34cbbb240df540ef00cb6e4005a0f"} Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.573370 4803 scope.go:117] "RemoveContainer" containerID="f01f0f9c8537248a1776b2db6051f6f0981d4cfd8dc742d678b97a0eee349e27" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.574242 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-hlq9h" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.576335 4803 generic.go:334] "Generic (PLEG): container finished" podID="f11f2e60-24d7-44a3-bb32-f88bc05a50a8" containerID="478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83" exitCode=0 Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.576502 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.576630 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" event={"ID":"f11f2e60-24d7-44a3-bb32-f88bc05a50a8","Type":"ContainerDied","Data":"478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83"} Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.577320 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl" event={"ID":"f11f2e60-24d7-44a3-bb32-f88bc05a50a8","Type":"ContainerDied","Data":"b2a1e9a2307192f35a033ad64bd672f625a4e25b827d41ba9be64566594b4f50"} Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.588641 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" event={"ID":"d0e2699e-927c-4274-9bcd-f20d91af15e5","Type":"ContainerStarted","Data":"6b6b7d3762c328b704aaa170834e26313365695b0cfb0b5b96e93de38949cffa"} Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.588695 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" event={"ID":"d0e2699e-927c-4274-9bcd-f20d91af15e5","Type":"ContainerStarted","Data":"77fccbf155a54f850e5b0f0a963334c616eecb7e9a47e56048888bbac6ec26c5"} Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.602245 4803 generic.go:334] "Generic (PLEG): container finished" podID="a25cb952-33ff-4a43-b67f-040ee9ed83b1" containerID="4e60538a609925f7494bbe16e1e5a4599a811f5101e2cc3eb4ecb6806fd1abdf" exitCode=0 Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.602603 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"a25cb952-33ff-4a43-b67f-040ee9ed83b1","Type":"ContainerDied","Data":"4e60538a609925f7494bbe16e1e5a4599a811f5101e2cc3eb4ecb6806fd1abdf"} Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.617988 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vsnl7"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.626006 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.626755 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.627066 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-catalog-content\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.627112 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdrbd\" (UniqueName: \"kubernetes.io/projected/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-kube-api-access-pdrbd\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.627154 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-utilities\") pod \"community-operators-gjntl\" 
(UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.627442 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:21.127419323 +0000 UTC m=+111.039011393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.627802 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-catalog-content\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.628201 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-utilities\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.642850 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsnl7"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.647905 4803 scope.go:117] "RemoveContainer" 
containerID="478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.659160 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdrbd\" (UniqueName: \"kubernetes.io/projected/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-kube-api-access-pdrbd\") pod \"community-operators-gjntl\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.677151 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hlq9h"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.682221 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-hlq9h"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.682308 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.688861 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-86gtl"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.691741 4803 scope.go:117] "RemoveContainer" containerID="478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.693322 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83\": container with ID starting with 478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83 not found: ID does not exist" containerID="478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.693360 4803 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83"} err="failed to get container status \"478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83\": rpc error: code = NotFound desc = could not find container \"478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83\": container with ID starting with 478008e50e5bbe5e75e4cbe43a55a79cb9f35a78823f3a8ede7b3a83ef175e83 not found: ID does not exist" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.728271 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-utilities\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.728320 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7z6\" (UniqueName: \"kubernetes.io/projected/198f3e33-8f27-4975-8d32-caa8e52db976-kube-api-access-ss7z6\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.728386 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-catalog-content\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.728455 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.728767 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:21.228754578 +0000 UTC m=+111.140346648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.760218 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.804721 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qf96s"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.805661 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.825438 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qf96s"] Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.829715 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.829920 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-catalog-content\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.830002 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-utilities\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.830043 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7z6\" (UniqueName: \"kubernetes.io/projected/198f3e33-8f27-4975-8d32-caa8e52db976-kube-api-access-ss7z6\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.830309 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-68bb848755-bf8k6"] Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.830381 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:21.330367342 +0000 UTC m=+111.241959412 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.830740 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-catalog-content\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.830943 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-utilities\") pod \"certified-operators-vsnl7\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.849735 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7z6\" (UniqueName: \"kubernetes.io/projected/198f3e33-8f27-4975-8d32-caa8e52db976-kube-api-access-ss7z6\") pod \"certified-operators-vsnl7\" (UID: 
\"198f3e33-8f27-4975-8d32-caa8e52db976\") " pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:20 crc kubenswrapper[4803]: W0320 17:18:20.865648 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7def941c_8b59_47e1_b0a8_b9b91e0ef645.slice/crio-8cee681538f2ccc934609ea65f6c1944e309e3ff0be16183a29f781dc3c76e3a WatchSource:0}: Error finding container 8cee681538f2ccc934609ea65f6c1944e309e3ff0be16183a29f781dc3c76e3a: Status 404 returned error can't find the container with id 8cee681538f2ccc934609ea65f6c1944e309e3ff0be16183a29f781dc3c76e3a Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.891341 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9636dcd-d0c5-4b44-96ed-aa4230163735" path="/var/lib/kubelet/pods/c9636dcd-d0c5-4b44-96ed-aa4230163735/volumes" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.892364 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11f2e60-24d7-44a3-bb32-f88bc05a50a8" path="/var/lib/kubelet/pods/f11f2e60-24d7-44a3-bb32-f88bc05a50a8/volumes" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.892755 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gddcq"] Mar 20 17:18:20 crc kubenswrapper[4803]: W0320 17:18:20.908630 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaf113de_0d1a_4ddf_9ed5_01e25b1bb66e.slice/crio-a5a26bf058d1b32792ec7d11d6a8a7aff9e1c205f08d92179a602e28890ed1f8 WatchSource:0}: Error finding container a5a26bf058d1b32792ec7d11d6a8a7aff9e1c205f08d92179a602e28890ed1f8: Status 404 returned error can't find the container with id a5a26bf058d1b32792ec7d11d6a8a7aff9e1c205f08d92179a602e28890ed1f8 Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.931460 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-catalog-content\") pod \"community-operators-qf96s\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.931541 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wpqd\" (UniqueName: \"kubernetes.io/projected/43d688d2-e15c-4129-8c1a-5b390313b012-kube-api-access-8wpqd\") pod \"community-operators-qf96s\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.931570 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-utilities\") pod \"community-operators-qf96s\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.931635 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:20 crc kubenswrapper[4803]: E0320 17:18:20.932305 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:21.432290066 +0000 UTC m=+111.343882136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:20 crc kubenswrapper[4803]: I0320 17:18:20.960040 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsnl7" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.033595 4803 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T17:18:20.564215307Z","Handler":null,"Name":""} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.038852 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:21 crc kubenswrapper[4803]: E0320 17:18:21.038927 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 17:18:21.538910749 +0000 UTC m=+111.450502809 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.039286 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wpqd\" (UniqueName: \"kubernetes.io/projected/43d688d2-e15c-4129-8c1a-5b390313b012-kube-api-access-8wpqd\") pod \"community-operators-qf96s\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.039329 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-utilities\") pod \"community-operators-qf96s\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.039411 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.039460 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-catalog-content\") pod \"community-operators-qf96s\" (UID: 
\"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.039895 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-catalog-content\") pod \"community-operators-qf96s\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:21 crc kubenswrapper[4803]: E0320 17:18:21.039961 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 17:18:21.539945934 +0000 UTC m=+111.451538004 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kwfxw" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.039964 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-utilities\") pod \"community-operators-qf96s\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.061797 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wpqd\" (UniqueName: \"kubernetes.io/projected/43d688d2-e15c-4129-8c1a-5b390313b012-kube-api-access-8wpqd\") pod \"community-operators-qf96s\" (UID: 
\"43d688d2-e15c-4129-8c1a-5b390313b012\") " pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.065360 4803 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.065409 4803 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.082117 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjntl"] Mar 20 17:18:21 crc kubenswrapper[4803]: W0320 17:18:21.097988 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46aaed67_fb9c_41fa_9f9f_9bfb3b2dfbf3.slice/crio-8fe02b7fc67557ba58a009bd40e8fb745dbad605f300b1392512c4b80dc29a19 WatchSource:0}: Error finding container 8fe02b7fc67557ba58a009bd40e8fb745dbad605f300b1392512c4b80dc29a19: Status 404 returned error can't find the container with id 8fe02b7fc67557ba58a009bd40e8fb745dbad605f300b1392512c4b80dc29a19 Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.139918 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.146424 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qf96s" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.148468 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.240929 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.256100 4803 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.256168 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.302883 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:21 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 17:18:21 crc kubenswrapper[4803]: [+]process-running ok Mar 20 17:18:21 crc kubenswrapper[4803]: healthz check failed Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.302961 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.341024 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vsnl7"] Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.404148 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kwfxw\" (UID: 
\"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.475643 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qf96s"] Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.553456 4803 patch_prober.go:28] interesting pod/downloads-7954f5f757-mm9jg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.553505 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mm9jg" podUID="374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.553839 4803 patch_prober.go:28] interesting pod/downloads-7954f5f757-mm9jg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.553859 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mm9jg" podUID="374b0189-5a1d-4a8a-b6e1-9886ee8f4bd3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.622171 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" 
event={"ID":"7def941c-8b59-47e1-b0a8-b9b91e0ef645","Type":"ContainerStarted","Data":"027e59bd23fb30c4cc47864c3a126021709bd8100f6542e05d43d9e684efb61f"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.622211 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" event={"ID":"7def941c-8b59-47e1-b0a8-b9b91e0ef645","Type":"ContainerStarted","Data":"8cee681538f2ccc934609ea65f6c1944e309e3ff0be16183a29f781dc3c76e3a"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.623055 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.627002 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.629343 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsnl7" event={"ID":"198f3e33-8f27-4975-8d32-caa8e52db976","Type":"ContainerStarted","Data":"02f86a0e8ffc595fa97718bf0f8b9c8628903a49bc0cef8ff044e33facdb913b"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.637593 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" event={"ID":"d0e2699e-927c-4274-9bcd-f20d91af15e5","Type":"ContainerStarted","Data":"388295e007d21e2eb2829669b592a55e0b56971d3010da0600b46b84de86d8cf"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.639977 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.647582 4803 generic.go:334] "Generic (PLEG): container finished" podID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerID="25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79" 
exitCode=0 Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.647684 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjntl" event={"ID":"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3","Type":"ContainerDied","Data":"25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.647714 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjntl" event={"ID":"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3","Type":"ContainerStarted","Data":"8fe02b7fc67557ba58a009bd40e8fb745dbad605f300b1392512c4b80dc29a19"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.651957 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf96s" event={"ID":"43d688d2-e15c-4129-8c1a-5b390313b012","Type":"ContainerStarted","Data":"0c88ee11d2ae1423a68a257e93f1edcb66581e607380a74230cdf007d9feb318"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.665971 4803 generic.go:334] "Generic (PLEG): container finished" podID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerID="9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e" exitCode=0 Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.666741 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gddcq" event={"ID":"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e","Type":"ContainerDied","Data":"9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.666769 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gddcq" event={"ID":"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e","Type":"ContainerStarted","Data":"a5a26bf058d1b32792ec7d11d6a8a7aff9e1c205f08d92179a602e28890ed1f8"} Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.674545 4803 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" podStartSLOduration=3.674514909 podStartE2EDuration="3.674514909s" podCreationTimestamp="2026-03-20 17:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:21.666978535 +0000 UTC m=+111.578570615" watchObservedRunningTime="2026-03-20 17:18:21.674514909 +0000 UTC m=+111.586106979" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.693124 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.693749 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.703241 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.703492 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.714862 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.717652 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-nsp5p" podStartSLOduration=12.717634067 podStartE2EDuration="12.717634067s" podCreationTimestamp="2026-03-20 17:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:21.711005413 +0000 UTC m=+111.622597483" watchObservedRunningTime="2026-03-20 17:18:21.717634067 +0000 UTC m=+111.629226137" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 
17:18:21.851615 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.851709 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.938321 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.939654 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.942717 4803 patch_prober.go:28] interesting pod/console-f9d7485db-t6lj8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.942761 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-t6lj8" podUID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" containerName="console" probeResult="failure" output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.953108 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.953183 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.953270 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:21 crc kubenswrapper[4803]: I0320 17:18:21.992281 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.015910 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.039863 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.041286 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.046299 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.051908 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.118678 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"] Mar 20 17:18:22 crc kubenswrapper[4803]: E0320 17:18:22.118965 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25cb952-33ff-4a43-b67f-040ee9ed83b1" containerName="pruner" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.118979 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25cb952-33ff-4a43-b67f-040ee9ed83b1" containerName="pruner" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.119087 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25cb952-33ff-4a43-b67f-040ee9ed83b1" containerName="pruner" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.119508 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.122862 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.122972 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.123242 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.123287 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.123444 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.127671 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.128224 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"] Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.156949 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kubelet-dir\") pod \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.158940 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kube-api-access\") pod \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\" (UID: \"a25cb952-33ff-4a43-b67f-040ee9ed83b1\") " Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.157124 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a25cb952-33ff-4a43-b67f-040ee9ed83b1" (UID: "a25cb952-33ff-4a43-b67f-040ee9ed83b1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.165723 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a25cb952-33ff-4a43-b67f-040ee9ed83b1" (UID: "a25cb952-33ff-4a43-b67f-040ee9ed83b1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.262483 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-config\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.263150 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th4qp\" (UniqueName: \"kubernetes.io/projected/58efc4b9-e931-4a7b-a306-a91f23b87a1f-kube-api-access-th4qp\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc 
kubenswrapper[4803]: I0320 17:18:22.263273 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58efc4b9-e931-4a7b-a306-a91f23b87a1f-serving-cert\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.263301 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-client-ca\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.263421 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.263447 4803 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a25cb952-33ff-4a43-b67f-040ee9ed83b1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.282622 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.289487 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 17:18:22 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld Mar 20 
17:18:22 crc kubenswrapper[4803]: [+]process-running ok Mar 20 17:18:22 crc kubenswrapper[4803]: healthz check failed Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.289536 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.321783 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwfxw"] Mar 20 17:18:22 crc kubenswrapper[4803]: W0320 17:18:22.362576 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbc5db9_573c_4314_9ebc_7b3e9e45f5bd.slice/crio-98f62eaf81b9d7d5110dafc6195e0d2fc28b5e0505d2157ae5839ad6452fc5c7 WatchSource:0}: Error finding container 98f62eaf81b9d7d5110dafc6195e0d2fc28b5e0505d2157ae5839ad6452fc5c7: Status 404 returned error can't find the container with id 98f62eaf81b9d7d5110dafc6195e0d2fc28b5e0505d2157ae5839ad6452fc5c7 Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.390604 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58efc4b9-e931-4a7b-a306-a91f23b87a1f-serving-cert\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.390653 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-client-ca\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " 
pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.390701 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-config\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.390752 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th4qp\" (UniqueName: \"kubernetes.io/projected/58efc4b9-e931-4a7b-a306-a91f23b87a1f-kube-api-access-th4qp\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.391887 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-client-ca\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.395305 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.408395 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.409168 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-config\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.409476 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.419777 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58efc4b9-e931-4a7b-a306-a91f23b87a1f-serving-cert\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.425855 4803 patch_prober.go:28] interesting pod/apiserver-76f77b778f-zzf4k container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]log ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]etcd ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/max-in-flight-filter ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 17:18:22 crc kubenswrapper[4803]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 17:18:22 crc kubenswrapper[4803]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason 
withheld Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/openshift.io-startinformers ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 17:18:22 crc kubenswrapper[4803]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 17:18:22 crc kubenswrapper[4803]: livez check failed Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.425931 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k" podUID="80befa86-f1dc-4d44-8d9a-7b50b557159d" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.429232 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th4qp\" (UniqueName: \"kubernetes.io/projected/58efc4b9-e931-4a7b-a306-a91f23b87a1f-kube-api-access-th4qp\") pod \"route-controller-manager-78c67d479b-25j4w\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.443077 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mr2dq"] Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.445825 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.445931 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.452970 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.473598 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.477552 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr2dq"] Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.505579 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8fms\" (UniqueName: \"kubernetes.io/projected/a5db8851-4faf-41b9-9f19-56ae943e1f07-kube-api-access-z8fms\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.505951 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-utilities\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.506084 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-catalog-content\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.550366 4803 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.607772 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8fms\" (UniqueName: \"kubernetes.io/projected/a5db8851-4faf-41b9-9f19-56ae943e1f07-kube-api-access-z8fms\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.608780 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-utilities\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.610381 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-utilities\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.610688 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-catalog-content\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.611450 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-catalog-content\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " 
pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.628750 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8fms\" (UniqueName: \"kubernetes.io/projected/a5db8851-4faf-41b9-9f19-56ae943e1f07-kube-api-access-z8fms\") pod \"redhat-marketplace-mr2dq\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.677226 4803 generic.go:334] "Generic (PLEG): container finished" podID="43d688d2-e15c-4129-8c1a-5b390313b012" containerID="3061a34269426f4684848f0238cba11b4e2e0b0259e3150500fc01d8eaaaaf69" exitCode=0 Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.677329 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf96s" event={"ID":"43d688d2-e15c-4129-8c1a-5b390313b012","Type":"ContainerDied","Data":"3061a34269426f4684848f0238cba11b4e2e0b0259e3150500fc01d8eaaaaf69"} Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.679159 4803 generic.go:334] "Generic (PLEG): container finished" podID="198f3e33-8f27-4975-8d32-caa8e52db976" containerID="6feff3542e2c7ae784dcdafe4db6b281694f53922faa1e45fcedf5b961c198b9" exitCode=0 Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.679194 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsnl7" event={"ID":"198f3e33-8f27-4975-8d32-caa8e52db976","Type":"ContainerDied","Data":"6feff3542e2c7ae784dcdafe4db6b281694f53922faa1e45fcedf5b961c198b9"} Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.700304 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5","Type":"ContainerStarted","Data":"22d2cd79f2e02dde58a0eb3831e0660a657ef86d6ed4c0ac83a79ddcdd735a88"} Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.706824 4803 
generic.go:334] "Generic (PLEG): container finished" podID="ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" containerID="acc2f987afb39df4999db0d2666d8d9e52c97902961a5c4f37dedb9b0bdb043e" exitCode=0 Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.706892 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" event={"ID":"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a","Type":"ContainerDied","Data":"acc2f987afb39df4999db0d2666d8d9e52c97902961a5c4f37dedb9b0bdb043e"} Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.710250 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a25cb952-33ff-4a43-b67f-040ee9ed83b1","Type":"ContainerDied","Data":"d0e93ac84b16e2dad79c29c182a3f85eb1d9ae7511696d24e2a991848550baef"} Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.710299 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0e93ac84b16e2dad79c29c182a3f85eb1d9ae7511696d24e2a991848550baef" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.710371 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.732055 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" event={"ID":"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd","Type":"ContainerStarted","Data":"18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0"} Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.732553 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" event={"ID":"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd","Type":"ContainerStarted","Data":"98f62eaf81b9d7d5110dafc6195e0d2fc28b5e0505d2157ae5839ad6452fc5c7"} Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.732570 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.746257 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hpjq8" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.766113 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" podStartSLOduration=68.766087729 podStartE2EDuration="1m8.766087729s" podCreationTimestamp="2026-03-20 17:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:22.763416379 +0000 UTC m=+112.675008459" watchObservedRunningTime="2026-03-20 17:18:22.766087729 +0000 UTC m=+112.677679829" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.814196 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.819994 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5dnfh"] Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.821191 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dnfh" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.838561 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7ffc\" (UniqueName: \"kubernetes.io/projected/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-kube-api-access-n7ffc\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.838754 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-utilities\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.838794 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-catalog-content\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh" Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.840141 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dnfh"] Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.881004 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.881821 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.885568 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"]
Mar 20 17:18:22 crc kubenswrapper[4803]: W0320 17:18:22.939164 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58efc4b9_e931_4a7b_a306_a91f23b87a1f.slice/crio-6b2558f4e142615923af90f06eb3a05845be2925bc6986ee431a7b5e0d83b047 WatchSource:0}: Error finding container 6b2558f4e142615923af90f06eb3a05845be2925bc6986ee431a7b5e0d83b047: Status 404 returned error can't find the container with id 6b2558f4e142615923af90f06eb3a05845be2925bc6986ee431a7b5e0d83b047
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.939687 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7ffc\" (UniqueName: \"kubernetes.io/projected/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-kube-api-access-n7ffc\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.939883 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-utilities\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.940029 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-catalog-content\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.941112 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-utilities\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.943434 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-catalog-content\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:18:22 crc kubenswrapper[4803]: I0320 17:18:22.958205 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7ffc\" (UniqueName: \"kubernetes.io/projected/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-kube-api-access-n7ffc\") pod \"redhat-marketplace-5dnfh\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") " pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:18:23 crc kubenswrapper[4803]: E0320 17:18:23.067896 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 17:18:23 crc kubenswrapper[4803]: E0320 17:18:23.072214 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 17:18:23 crc kubenswrapper[4803]: E0320 17:18:23.073820 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 17:18:23 crc kubenswrapper[4803]: E0320 17:18:23.073849 4803 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.194818 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.291293 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:23 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:23 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:23 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.291377 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.382506 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr2dq"]
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.414876 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t78tx"]
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.415867 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.418150 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.425100 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t78tx"]
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.459945 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8r5q\" (UniqueName: \"kubernetes.io/projected/139768c1-c8fa-4890-952b-2a9f3e152ca3-kube-api-access-q8r5q\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.460030 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-catalog-content\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.460106 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-utilities\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.478233 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.478211135 podStartE2EDuration="1.478211135s" podCreationTimestamp="2026-03-20 17:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:23.461297794 +0000 UTC m=+113.372889874" watchObservedRunningTime="2026-03-20 17:18:23.478211135 +0000 UTC m=+113.389803205"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.564243 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-catalog-content\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.564571 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-utilities\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.564642 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8r5q\" (UniqueName: \"kubernetes.io/projected/139768c1-c8fa-4890-952b-2a9f3e152ca3-kube-api-access-q8r5q\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.565881 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-catalog-content\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.566063 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-utilities\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.586355 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8r5q\" (UniqueName: \"kubernetes.io/projected/139768c1-c8fa-4890-952b-2a9f3e152ca3-kube-api-access-q8r5q\") pod \"redhat-operators-t78tx\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.739895 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerStarted","Data":"05ec152d2b12af2991e4e227047c7db968a1983b123f5774695cc57ce0d81404"}
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.743788 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5","Type":"ContainerStarted","Data":"86272eb0738b6f9fd8a2156b90db414252f68df04d4b1acefd9e9cb983584ef3"}
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.746973 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" event={"ID":"58efc4b9-e931-4a7b-a306-a91f23b87a1f","Type":"ContainerStarted","Data":"6b2558f4e142615923af90f06eb3a05845be2925bc6986ee431a7b5e0d83b047"}
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.761044 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.761023673 podStartE2EDuration="2.761023673s" podCreationTimestamp="2026-03-20 17:18:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:23.759833843 +0000 UTC m=+113.671425923" watchObservedRunningTime="2026-03-20 17:18:23.761023673 +0000 UTC m=+113.672615743"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.761948 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.802110 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8ncw7"]
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.803616 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.814209 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8ncw7"]
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.860846 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dnfh"]
Mar 20 17:18:23 crc kubenswrapper[4803]: W0320 17:18:23.863893 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d3c1fc_3d92_45a2_aa9c_23f7407aa531.slice/crio-9b3771d0550cf5ed649b95db1d193f394551dfe3c4b14958c137794b3d35b9a5 WatchSource:0}: Error finding container 9b3771d0550cf5ed649b95db1d193f394551dfe3c4b14958c137794b3d35b9a5: Status 404 returned error can't find the container with id 9b3771d0550cf5ed649b95db1d193f394551dfe3c4b14958c137794b3d35b9a5
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.981921 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-catalog-content\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.981978 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-utilities\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:23 crc kubenswrapper[4803]: I0320 17:18:23.982134 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plbpg\" (UniqueName: \"kubernetes.io/projected/7e906a75-4a60-419b-9248-89a5e14229f0-kube-api-access-plbpg\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.005758 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.084729 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-catalog-content\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.084819 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-utilities\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.084904 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plbpg\" (UniqueName: \"kubernetes.io/projected/7e906a75-4a60-419b-9248-89a5e14229f0-kube-api-access-plbpg\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.085288 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-catalog-content\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.085405 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-utilities\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.103781 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plbpg\" (UniqueName: \"kubernetes.io/projected/7e906a75-4a60-419b-9248-89a5e14229f0-kube-api-access-plbpg\") pod \"redhat-operators-8ncw7\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.145463 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.185319 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-secret-volume\") pod \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") "
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.185423 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv25f\" (UniqueName: \"kubernetes.io/projected/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-kube-api-access-cv25f\") pod \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") "
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.185453 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume\") pod \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\" (UID: \"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a\") "
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.186492 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" (UID: "ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.191100 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-kube-api-access-cv25f" (OuterVolumeSpecName: "kube-api-access-cv25f") pod "ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" (UID: "ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a"). InnerVolumeSpecName "kube-api-access-cv25f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.197030 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" (UID: "ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.268154 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t78tx"]
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.293727 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:24 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:24 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:24 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:24 crc kubenswrapper[4803]: W0320 17:18:24.293763 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod139768c1_c8fa_4890_952b_2a9f3e152ca3.slice/crio-070f7ce915aa86c7d1c3e0233e4733354dce6c39d63c09b6188e749cf8d26a64 WatchSource:0}: Error finding container 070f7ce915aa86c7d1c3e0233e4733354dce6c39d63c09b6188e749cf8d26a64: Status 404 returned error can't find the container with id 070f7ce915aa86c7d1c3e0233e4733354dce6c39d63c09b6188e749cf8d26a64
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.293835 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.293932 4803 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.293958 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cv25f\" (UniqueName: \"kubernetes.io/projected/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-kube-api-access-cv25f\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.293967 4803 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.416097 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8ncw7"]
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.769765 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dnfh" event={"ID":"21d3c1fc-3d92-45a2-aa9c-23f7407aa531","Type":"ContainerStarted","Data":"9b3771d0550cf5ed649b95db1d193f394551dfe3c4b14958c137794b3d35b9a5"}
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.773974 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerStarted","Data":"64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984"}
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.777886 4803 generic.go:334] "Generic (PLEG): container finished" podID="7ce1774f-bcb9-45cd-bd86-9c38d67b89b5" containerID="86272eb0738b6f9fd8a2156b90db414252f68df04d4b1acefd9e9cb983584ef3" exitCode=0
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.777936 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5","Type":"ContainerDied","Data":"86272eb0738b6f9fd8a2156b90db414252f68df04d4b1acefd9e9cb983584ef3"}
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.780026 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg" event={"ID":"ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a","Type":"ContainerDied","Data":"e78a976bab2a67efa091d6ccbdda747f58e1630ad2b658f2a7afd2b83d5bafae"}
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.780061 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e78a976bab2a67efa091d6ccbdda747f58e1630ad2b658f2a7afd2b83d5bafae"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.780139 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.783078 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" event={"ID":"58efc4b9-e931-4a7b-a306-a91f23b87a1f","Type":"ContainerStarted","Data":"bd04e64dcf138ba2f8902f34a07c6f43ffc493caf1fc3812a7a8c1c87477ec01"}
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.784861 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ncw7" event={"ID":"7e906a75-4a60-419b-9248-89a5e14229f0","Type":"ContainerStarted","Data":"4f3f7cb0b9175f66b806d52fabb0a3f8293dd05c323d6576c04801b20dbfead9"}
Mar 20 17:18:24 crc kubenswrapper[4803]: I0320 17:18:24.786486 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t78tx" event={"ID":"139768c1-c8fa-4890-952b-2a9f3e152ca3","Type":"ContainerStarted","Data":"070f7ce915aa86c7d1c3e0233e4733354dce6c39d63c09b6188e749cf8d26a64"}
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.285087 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:25 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:25 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:25 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.285142 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.692514 4803 ???:1] "http: TLS handshake error from 192.168.126.11:52626: no serving certificate available for the kubelet"
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.801443 4803 generic.go:334] "Generic (PLEG): container finished" podID="7e906a75-4a60-419b-9248-89a5e14229f0" containerID="15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512" exitCode=0
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.801546 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ncw7" event={"ID":"7e906a75-4a60-419b-9248-89a5e14229f0","Type":"ContainerDied","Data":"15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512"}
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.803856 4803 generic.go:334] "Generic (PLEG): container finished" podID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerID="d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1" exitCode=0
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.803920 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t78tx" event={"ID":"139768c1-c8fa-4890-952b-2a9f3e152ca3","Type":"ContainerDied","Data":"d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1"}
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.808292 4803 generic.go:334] "Generic (PLEG): container finished" podID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerID="6369edd74cf75ba67be0ebef298eb3eba9632e4ffa87ed0906825392bd22c018" exitCode=0
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.808408 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dnfh" event={"ID":"21d3c1fc-3d92-45a2-aa9c-23f7407aa531","Type":"ContainerDied","Data":"6369edd74cf75ba67be0ebef298eb3eba9632e4ffa87ed0906825392bd22c018"}
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.815944 4803 generic.go:334] "Generic (PLEG): container finished" podID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerID="64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984" exitCode=0
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.816049 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerDied","Data":"64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984"}
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.816586 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.821641 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"
Mar 20 17:18:25 crc kubenswrapper[4803]: I0320 17:18:25.840342 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" podStartSLOduration=7.840322052 podStartE2EDuration="7.840322052s" podCreationTimestamp="2026-03-20 17:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:25.83937083 +0000 UTC m=+115.750962920" watchObservedRunningTime="2026-03-20 17:18:25.840322052 +0000 UTC m=+115.751914142"
Mar 20 17:18:26 crc kubenswrapper[4803]: I0320 17:18:26.229606 4803 ???:1] "http: TLS handshake error from 192.168.126.11:52630: no serving certificate available for the kubelet"
Mar 20 17:18:26 crc kubenswrapper[4803]: I0320 17:18:26.287367 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:26 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:26 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:26 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:26 crc kubenswrapper[4803]: I0320 17:18:26.287485 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:27 crc kubenswrapper[4803]: I0320 17:18:27.284688 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:27 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:27 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:27 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:27 crc kubenswrapper[4803]: I0320 17:18:27.284744 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:27 crc kubenswrapper[4803]: I0320 17:18:27.411272 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k"
Mar 20 17:18:27 crc kubenswrapper[4803]: I0320 17:18:27.415826 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zzf4k"
Mar 20 17:18:27 crc kubenswrapper[4803]: I0320 17:18:27.727155 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-bxk24"
Mar 20 17:18:28 crc kubenswrapper[4803]: I0320 17:18:28.286550 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:28 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:28 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:28 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:28 crc kubenswrapper[4803]: I0320 17:18:28.286854 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.288692 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:29 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:29 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:29 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.288779 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.513332 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.682970 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kube-api-access\") pod \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") "
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.683365 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kubelet-dir\") pod \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\" (UID: \"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5\") "
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.683673 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7ce1774f-bcb9-45cd-bd86-9c38d67b89b5" (UID: "7ce1774f-bcb9-45cd-bd86-9c38d67b89b5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.687961 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7ce1774f-bcb9-45cd-bd86-9c38d67b89b5" (UID: "7ce1774f-bcb9-45cd-bd86-9c38d67b89b5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.784518 4803 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.784564 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce1774f-bcb9-45cd-bd86-9c38d67b89b5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.846998 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7ce1774f-bcb9-45cd-bd86-9c38d67b89b5","Type":"ContainerDied","Data":"22d2cd79f2e02dde58a0eb3831e0660a657ef86d6ed4c0ac83a79ddcdd735a88"}
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.847033 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22d2cd79f2e02dde58a0eb3831e0660a657ef86d6ed4c0ac83a79ddcdd735a88"
Mar 20 17:18:29 crc kubenswrapper[4803]: I0320 17:18:29.847068 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 17:18:30 crc kubenswrapper[4803]: I0320 17:18:30.287040 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:30 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:30 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:30 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:30 crc kubenswrapper[4803]: I0320 17:18:30.287109 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:31 crc kubenswrapper[4803]: I0320 17:18:31.291327 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:31 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:31 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:31 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:31 crc kubenswrapper[4803]: I0320 17:18:31.291907 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:31 crc kubenswrapper[4803]: I0320 17:18:31.559045 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mm9jg"
Mar 20 17:18:31 crc kubenswrapper[4803]: I0320 17:18:31.944674 4803 patch_prober.go:28] interesting pod/console-f9d7485db-t6lj8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body=
Mar 20 17:18:31 crc kubenswrapper[4803]: I0320 17:18:31.944738 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-t6lj8" podUID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" containerName="console" probeResult="failure" output="Get \"https://10.217.0.36:8443/health\": dial tcp 10.217.0.36:8443: connect: connection refused"
Mar 20 17:18:32 crc kubenswrapper[4803]: I0320 17:18:32.285205 4803 patch_prober.go:28] interesting pod/router-default-5444994796-r9r29 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 17:18:32 crc kubenswrapper[4803]: [-]has-synced failed: reason withheld
Mar 20 17:18:32 crc kubenswrapper[4803]: [+]process-running ok
Mar 20 17:18:32 crc kubenswrapper[4803]: healthz check failed
Mar 20 17:18:32 crc kubenswrapper[4803]: I0320 17:18:32.285536 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r9r29" podUID="e60f36ac-4efd-493f-9903-a0311c9d6216" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 17:18:33 crc kubenswrapper[4803]: E0320 17:18:33.050746 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 20 17:18:33 crc kubenswrapper[4803]: E0320 17:18:33.052885 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:33 crc kubenswrapper[4803]: E0320 17:18:33.056025 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:33 crc kubenswrapper[4803]: E0320 17:18:33.056072 4803 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins" Mar 20 17:18:33 crc kubenswrapper[4803]: I0320 17:18:33.287786 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:33 crc kubenswrapper[4803]: I0320 17:18:33.291686 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r9r29" Mar 20 17:18:35 crc kubenswrapper[4803]: I0320 17:18:35.860626 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 17:18:35 crc kubenswrapper[4803]: I0320 17:18:35.954445 4803 ???:1] "http: TLS handshake error from 192.168.126.11:58310: no serving certificate available for the kubelet" Mar 20 17:18:37 crc kubenswrapper[4803]: I0320 17:18:37.335437 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68bb848755-bf8k6"] Mar 20 17:18:37 crc kubenswrapper[4803]: I0320 
17:18:37.335775 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerName="controller-manager" containerID="cri-o://027e59bd23fb30c4cc47864c3a126021709bd8100f6542e05d43d9e684efb61f" gracePeriod=30 Mar 20 17:18:37 crc kubenswrapper[4803]: I0320 17:18:37.352060 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"] Mar 20 17:18:37 crc kubenswrapper[4803]: I0320 17:18:37.352383 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerName="route-controller-manager" containerID="cri-o://bd04e64dcf138ba2f8902f34a07c6f43ffc493caf1fc3812a7a8c1c87477ec01" gracePeriod=30 Mar 20 17:18:37 crc kubenswrapper[4803]: I0320 17:18:37.371725 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=2.37170147 podStartE2EDuration="2.37170147s" podCreationTimestamp="2026-03-20 17:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:18:37.366826146 +0000 UTC m=+127.278418286" watchObservedRunningTime="2026-03-20 17:18:37.37170147 +0000 UTC m=+127.283293550" Mar 20 17:18:38 crc kubenswrapper[4803]: I0320 17:18:38.904084 4803 generic.go:334] "Generic (PLEG): container finished" podID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerID="bd04e64dcf138ba2f8902f34a07c6f43ffc493caf1fc3812a7a8c1c87477ec01" exitCode=0 Mar 20 17:18:38 crc kubenswrapper[4803]: I0320 17:18:38.904142 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" 
event={"ID":"58efc4b9-e931-4a7b-a306-a91f23b87a1f","Type":"ContainerDied","Data":"bd04e64dcf138ba2f8902f34a07c6f43ffc493caf1fc3812a7a8c1c87477ec01"} Mar 20 17:18:38 crc kubenswrapper[4803]: I0320 17:18:38.905947 4803 generic.go:334] "Generic (PLEG): container finished" podID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerID="027e59bd23fb30c4cc47864c3a126021709bd8100f6542e05d43d9e684efb61f" exitCode=0 Mar 20 17:18:38 crc kubenswrapper[4803]: I0320 17:18:38.905969 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" event={"ID":"7def941c-8b59-47e1-b0a8-b9b91e0ef645","Type":"ContainerDied","Data":"027e59bd23fb30c4cc47864c3a126021709bd8100f6542e05d43d9e684efb61f"} Mar 20 17:18:40 crc kubenswrapper[4803]: I0320 17:18:40.480555 4803 patch_prober.go:28] interesting pod/controller-manager-68bb848755-bf8k6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 20 17:18:40 crc kubenswrapper[4803]: I0320 17:18:40.480613 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.633966 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.777659 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.777719 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.777770 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.779095 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.780399 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.791278 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.794475 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.807259 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.883487 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.904807 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.940470 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:41 crc kubenswrapper[4803]: I0320 17:18:41.943829 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-t6lj8" Mar 20 17:18:42 crc kubenswrapper[4803]: I0320 17:18:42.447143 4803 patch_prober.go:28] interesting pod/route-controller-manager-78c67d479b-25j4w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 20 17:18:42 crc kubenswrapper[4803]: I0320 17:18:42.447814 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 20 17:18:43 crc kubenswrapper[4803]: E0320 17:18:43.050452 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:43 crc kubenswrapper[4803]: E0320 17:18:43.051874 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" 
cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:43 crc kubenswrapper[4803]: E0320 17:18:43.053514 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:43 crc kubenswrapper[4803]: E0320 17:18:43.053570 4803 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins" Mar 20 17:18:43 crc kubenswrapper[4803]: I0320 17:18:43.906814 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:43 crc kubenswrapper[4803]: I0320 17:18:43.915834 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:43 crc kubenswrapper[4803]: I0320 17:18:43.997074 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:18:50 crc kubenswrapper[4803]: I0320 17:18:50.480327 4803 patch_prober.go:28] interesting pod/controller-manager-68bb848755-bf8k6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 20 17:18:50 crc kubenswrapper[4803]: I0320 17:18:50.480716 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 20 17:18:51 crc kubenswrapper[4803]: E0320 17:18:51.751040 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84abd826_e5c1_4868_920c_10986d5e840c.slice/crio-19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:18:52 crc kubenswrapper[4803]: E0320 17:18:52.382673 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 17:18:52 crc kubenswrapper[4803]: E0320 17:18:52.383079 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pdrbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gjntl_openshift-marketplace(46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:18:52 crc kubenswrapper[4803]: E0320 17:18:52.385586 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gjntl" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" Mar 20 17:18:53 crc 
kubenswrapper[4803]: E0320 17:18:53.048195 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:53 crc kubenswrapper[4803]: E0320 17:18:53.048903 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:53 crc kubenswrapper[4803]: E0320 17:18:53.049239 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:18:53 crc kubenswrapper[4803]: E0320 17:18:53.049365 4803 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins" Mar 20 17:18:53 crc kubenswrapper[4803]: I0320 17:18:53.272204 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zxfx7" Mar 20 17:18:53 crc kubenswrapper[4803]: I0320 17:18:53.447214 4803 patch_prober.go:28] interesting pod/route-controller-manager-78c67d479b-25j4w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" start-of-body= Mar 20 17:18:53 crc kubenswrapper[4803]: I0320 17:18:53.447274 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: i/o timeout" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.004504 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6s6cl_84abd826-e5c1-4868-920c-10986d5e840c/kube-multus-additional-cni-plugins/0.log" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.004587 4803 generic.go:334] "Generic (PLEG): container finished" podID="84abd826-e5c1-4868-920c-10986d5e840c" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" exitCode=137 Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.004636 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" event={"ID":"84abd826-e5c1-4868-920c-10986d5e840c","Type":"ContainerDied","Data":"19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654"} Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.888682 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:18:54 crc kubenswrapper[4803]: E0320 17:18:54.889009 4803 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" containerName="collect-profiles" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.889031 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" containerName="collect-profiles" Mar 20 17:18:54 crc kubenswrapper[4803]: E0320 17:18:54.889051 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ce1774f-bcb9-45cd-bd86-9c38d67b89b5" containerName="pruner" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.889063 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ce1774f-bcb9-45cd-bd86-9c38d67b89b5" containerName="pruner" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.889253 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" containerName="collect-profiles" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.889271 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ce1774f-bcb9-45cd-bd86-9c38d67b89b5" containerName="pruner" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.889845 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.893675 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.894639 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 17:18:54 crc kubenswrapper[4803]: I0320 17:18:54.904004 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:18:55 crc kubenswrapper[4803]: I0320 17:18:55.020649 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:55 crc kubenswrapper[4803]: I0320 17:18:55.020745 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:55 crc kubenswrapper[4803]: I0320 17:18:55.122875 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:55 crc kubenswrapper[4803]: I0320 17:18:55.123006 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:55 crc kubenswrapper[4803]: I0320 17:18:55.123167 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:55 crc kubenswrapper[4803]: I0320 17:18:55.161026 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:55 crc kubenswrapper[4803]: I0320 17:18:55.222346 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:18:55 crc kubenswrapper[4803]: E0320 17:18:55.703398 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gjntl" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.279367 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.282598 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.305098 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.362632 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.362696 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d349bec2-b61d-40f3-9331-580aac5a4d4d-kube-api-access\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.362831 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-var-lock\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.463827 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.463905 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d349bec2-b61d-40f3-9331-580aac5a4d4d-kube-api-access\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.463943 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.463965 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-var-lock\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.464074 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-var-lock\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.493867 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d349bec2-b61d-40f3-9331-580aac5a4d4d-kube-api-access\") pod \"installer-9-crc\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: I0320 17:19:00.618970 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:00 crc kubenswrapper[4803]: E0320 17:19:00.730934 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 17:19:00 crc kubenswrapper[4803]: E0320 17:19:00.731114 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:19:00 crc kubenswrapper[4803]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 17:19:00 crc kubenswrapper[4803]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ztqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29567118-clg6s_openshift-infra(96658fb9-4742-457e-b7ec-384ef06ec6a8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 17:19:00 crc kubenswrapper[4803]: > logger="UnhandledError" Mar 20 17:19:00 crc kubenswrapper[4803]: E0320 17:19:00.732352 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29567118-clg6s" podUID="96658fb9-4742-457e-b7ec-384ef06ec6a8" Mar 20 17:19:01 crc kubenswrapper[4803]: E0320 17:19:01.052807 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29567118-clg6s" podUID="96658fb9-4742-457e-b7ec-384ef06ec6a8" Mar 20 17:19:01 crc kubenswrapper[4803]: I0320 17:19:01.479613 4803 patch_prober.go:28] interesting pod/controller-manager-68bb848755-bf8k6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:19:01 crc kubenswrapper[4803]: I0320 17:19:01.479699 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:19:02 crc kubenswrapper[4803]: E0320 17:19:02.455697 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 17:19:02 crc kubenswrapper[4803]: E0320 17:19:02.455917 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8wpqd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-qf96s_openshift-marketplace(43d688d2-e15c-4129-8c1a-5b390313b012): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:19:02 crc kubenswrapper[4803]: E0320 17:19:02.457357 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-qf96s" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" Mar 20 17:19:03 crc kubenswrapper[4803]: E0320 17:19:03.048794 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:19:03 crc kubenswrapper[4803]: E0320 17:19:03.049410 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:19:03 crc kubenswrapper[4803]: E0320 17:19:03.050273 4803 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 20 17:19:03 crc kubenswrapper[4803]: E0320 17:19:03.050419 4803 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" podUID="84abd826-e5c1-4868-920c-10986d5e840c" 
containerName="kube-multus-additional-cni-plugins" Mar 20 17:19:03 crc kubenswrapper[4803]: I0320 17:19:03.448126 4803 patch_prober.go:28] interesting pod/route-controller-manager-78c67d479b-25j4w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 17:19:03 crc kubenswrapper[4803]: I0320 17:19:03.448464 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 17:19:06 crc kubenswrapper[4803]: E0320 17:19:06.537250 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 17:19:06 crc kubenswrapper[4803]: E0320 17:19:06.538470 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9vh4n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-gddcq_openshift-marketplace(baf113de-0d1a-4ddf-9ed5-01e25b1bb66e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:19:06 crc kubenswrapper[4803]: E0320 17:19:06.540020 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-gddcq" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" Mar 20 17:19:08 crc 
kubenswrapper[4803]: E0320 17:19:08.858098 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-qf96s" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.858617 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-gddcq" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.914223 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.914606 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ss7z6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-vsnl7_openshift-marketplace(198f3e33-8f27-4975-8d32-caa8e52db976): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.915946 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-vsnl7" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" Mar 20 17:19:08 crc 
kubenswrapper[4803]: E0320 17:19:08.952213 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.952379 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plbpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-8ncw7_openshift-marketplace(7e906a75-4a60-419b-9248-89a5e14229f0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.953606 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8ncw7" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.976441 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.976727 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q8r5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-t78tx_openshift-marketplace(139768c1-c8fa-4890-952b-2a9f3e152ca3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:19:08 crc kubenswrapper[4803]: E0320 17:19:08.978001 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-t78tx" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" Mar 20 17:19:10 crc 
kubenswrapper[4803]: E0320 17:19:10.314613 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-vsnl7" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.314789 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-t78tx" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.315379 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8ncw7" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.355581 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.355742 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n7ffc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5dnfh_openshift-marketplace(21d3c1fc-3d92-45a2-aa9c-23f7407aa531): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.357494 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5dnfh" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" Mar 20 17:19:10 crc 
kubenswrapper[4803]: I0320 17:19:10.415882 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.421669 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.426305 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6s6cl_84abd826-e5c1-4868-920c-10986d5e840c/kube-multus-additional-cni-plugins/0.log" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.426491 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.452769 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.453025 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8fms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mr2dq_openshift-marketplace(a5db8851-4faf-41b9-9f19-56ae943e1f07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.454773 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mr2dq" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" Mar 20 17:19:10 crc 
kubenswrapper[4803]: I0320 17:19:10.473920 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"] Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.474355 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.474431 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.474512 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerName="controller-manager" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.474597 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerName="controller-manager" Mar 20 17:19:10 crc kubenswrapper[4803]: E0320 17:19:10.474661 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerName="route-controller-manager" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.474717 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerName="route-controller-manager" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.475192 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="84abd826-e5c1-4868-920c-10986d5e840c" containerName="kube-multus-additional-cni-plugins" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.475277 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" containerName="controller-manager" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.475336 4803 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" containerName="route-controller-manager" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.476097 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.483634 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"] Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.535559 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58efc4b9-e931-4a7b-a306-a91f23b87a1f-serving-cert\") pod \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.535626 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm6rq\" (UniqueName: \"kubernetes.io/projected/7def941c-8b59-47e1-b0a8-b9b91e0ef645-kube-api-access-vm6rq\") pod \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.535677 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/84abd826-e5c1-4868-920c-10986d5e840c-ready\") pod \"84abd826-e5c1-4868-920c-10986d5e840c\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.538482 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84abd826-e5c1-4868-920c-10986d5e840c-ready" (OuterVolumeSpecName: "ready") pod "84abd826-e5c1-4868-920c-10986d5e840c" (UID: "84abd826-e5c1-4868-920c-10986d5e840c"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539587 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th4qp\" (UniqueName: \"kubernetes.io/projected/58efc4b9-e931-4a7b-a306-a91f23b87a1f-kube-api-access-th4qp\") pod \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539657 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84abd826-e5c1-4868-920c-10986d5e840c-cni-sysctl-allowlist\") pod \"84abd826-e5c1-4868-920c-10986d5e840c\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539676 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9qgd\" (UniqueName: \"kubernetes.io/projected/84abd826-e5c1-4868-920c-10986d5e840c-kube-api-access-p9qgd\") pod \"84abd826-e5c1-4868-920c-10986d5e840c\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539701 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7def941c-8b59-47e1-b0a8-b9b91e0ef645-serving-cert\") pod \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539748 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-proxy-ca-bundles\") pod \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539766 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-config\") pod \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539788 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-client-ca\") pod \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539806 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84abd826-e5c1-4868-920c-10986d5e840c-tuning-conf-dir\") pod \"84abd826-e5c1-4868-920c-10986d5e840c\" (UID: \"84abd826-e5c1-4868-920c-10986d5e840c\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539835 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-config\") pod \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\" (UID: \"7def941c-8b59-47e1-b0a8-b9b91e0ef645\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.539849 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-client-ca\") pod \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\" (UID: \"58efc4b9-e931-4a7b-a306-a91f23b87a1f\") " Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.540042 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-config\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " 
pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.540105 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxk7h\" (UniqueName: \"kubernetes.io/projected/d7d21bfd-4a56-4665-a9db-e3558458f4bf-kube-api-access-xxk7h\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.540125 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d21bfd-4a56-4665-a9db-e3558458f4bf-serving-cert\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.540164 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-client-ca\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.540310 4803 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/84abd826-e5c1-4868-920c-10986d5e840c-ready\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.543048 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-config" (OuterVolumeSpecName: "config") pod "7def941c-8b59-47e1-b0a8-b9b91e0ef645" (UID: 
"7def941c-8b59-47e1-b0a8-b9b91e0ef645"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.543643 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-client-ca" (OuterVolumeSpecName: "client-ca") pod "7def941c-8b59-47e1-b0a8-b9b91e0ef645" (UID: "7def941c-8b59-47e1-b0a8-b9b91e0ef645"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.543678 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/84abd826-e5c1-4868-920c-10986d5e840c-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "84abd826-e5c1-4868-920c-10986d5e840c" (UID: "84abd826-e5c1-4868-920c-10986d5e840c"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.543939 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-config" (OuterVolumeSpecName: "config") pod "58efc4b9-e931-4a7b-a306-a91f23b87a1f" (UID: "58efc4b9-e931-4a7b-a306-a91f23b87a1f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.544098 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7def941c-8b59-47e1-b0a8-b9b91e0ef645" (UID: "7def941c-8b59-47e1-b0a8-b9b91e0ef645"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.544383 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84abd826-e5c1-4868-920c-10986d5e840c-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "84abd826-e5c1-4868-920c-10986d5e840c" (UID: "84abd826-e5c1-4868-920c-10986d5e840c"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.544697 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58efc4b9-e931-4a7b-a306-a91f23b87a1f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "58efc4b9-e931-4a7b-a306-a91f23b87a1f" (UID: "58efc4b9-e931-4a7b-a306-a91f23b87a1f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.544824 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7def941c-8b59-47e1-b0a8-b9b91e0ef645-kube-api-access-vm6rq" (OuterVolumeSpecName: "kube-api-access-vm6rq") pod "7def941c-8b59-47e1-b0a8-b9b91e0ef645" (UID: "7def941c-8b59-47e1-b0a8-b9b91e0ef645"). InnerVolumeSpecName "kube-api-access-vm6rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.547062 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58efc4b9-e931-4a7b-a306-a91f23b87a1f-kube-api-access-th4qp" (OuterVolumeSpecName: "kube-api-access-th4qp") pod "58efc4b9-e931-4a7b-a306-a91f23b87a1f" (UID: "58efc4b9-e931-4a7b-a306-a91f23b87a1f"). InnerVolumeSpecName "kube-api-access-th4qp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.548430 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-client-ca" (OuterVolumeSpecName: "client-ca") pod "58efc4b9-e931-4a7b-a306-a91f23b87a1f" (UID: "58efc4b9-e931-4a7b-a306-a91f23b87a1f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.549320 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7def941c-8b59-47e1-b0a8-b9b91e0ef645-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7def941c-8b59-47e1-b0a8-b9b91e0ef645" (UID: "7def941c-8b59-47e1-b0a8-b9b91e0ef645"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.554608 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84abd826-e5c1-4868-920c-10986d5e840c-kube-api-access-p9qgd" (OuterVolumeSpecName: "kube-api-access-p9qgd") pod "84abd826-e5c1-4868-920c-10986d5e840c" (UID: "84abd826-e5c1-4868-920c-10986d5e840c"). InnerVolumeSpecName "kube-api-access-p9qgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641608 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-config\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641708 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxk7h\" (UniqueName: \"kubernetes.io/projected/d7d21bfd-4a56-4665-a9db-e3558458f4bf-kube-api-access-xxk7h\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641742 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d21bfd-4a56-4665-a9db-e3558458f4bf-serving-cert\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641778 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-client-ca\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641880 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm6rq\" (UniqueName: 
\"kubernetes.io/projected/7def941c-8b59-47e1-b0a8-b9b91e0ef645-kube-api-access-vm6rq\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641901 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th4qp\" (UniqueName: \"kubernetes.io/projected/58efc4b9-e931-4a7b-a306-a91f23b87a1f-kube-api-access-th4qp\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641914 4803 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/84abd826-e5c1-4868-920c-10986d5e840c-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641927 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9qgd\" (UniqueName: \"kubernetes.io/projected/84abd826-e5c1-4868-920c-10986d5e840c-kube-api-access-p9qgd\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641941 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7def941c-8b59-47e1-b0a8-b9b91e0ef645-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641953 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641966 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641977 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.641990 4803 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/84abd826-e5c1-4868-920c-10986d5e840c-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.642001 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7def941c-8b59-47e1-b0a8-b9b91e0ef645-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.642013 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/58efc4b9-e931-4a7b-a306-a91f23b87a1f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.642024 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58efc4b9-e931-4a7b-a306-a91f23b87a1f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.642762 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-config\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.643033 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-client-ca\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.645751 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d21bfd-4a56-4665-a9db-e3558458f4bf-serving-cert\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.657330 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxk7h\" (UniqueName: \"kubernetes.io/projected/d7d21bfd-4a56-4665-a9db-e3558458f4bf-kube-api-access-xxk7h\") pod \"route-controller-manager-66fffff6fc-nh449\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") " pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.810777 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.846087 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 17:19:10 crc kubenswrapper[4803]: I0320 17:19:10.908367 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 17:19:10 crc kubenswrapper[4803]: W0320 17:19:10.964106 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd349bec2_b61d_40f3_9331_580aac5a4d4d.slice/crio-43d0e7ca4d5414126bf1a8fa313030f40772857b7899656cab378d9eb4ab6ed0 WatchSource:0}: Error finding container 43d0e7ca4d5414126bf1a8fa313030f40772857b7899656cab378d9eb4ab6ed0: Status 404 returned error can't find the container with id 43d0e7ca4d5414126bf1a8fa313030f40772857b7899656cab378d9eb4ab6ed0 Mar 20 17:19:10 crc kubenswrapper[4803]: W0320 17:19:10.968691 4803 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-88063758cba32f01b0fb885340dff004d9bf9a10e229adfcd5d4e181a3992797 WatchSource:0}: Error finding container 88063758cba32f01b0fb885340dff004d9bf9a10e229adfcd5d4e181a3992797: Status 404 returned error can't find the container with id 88063758cba32f01b0fb885340dff004d9bf9a10e229adfcd5d4e181a3992797 Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.085793 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"] Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.154651 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7e939e3c0b32c431d85952c34f2a576e8913a5061c877ab35d9dc1d5284431e8"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.154695 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"88063758cba32f01b0fb885340dff004d9bf9a10e229adfcd5d4e181a3992797"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.154877 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.161483 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6400c992a206de1eb5860ab85b67ccc1e3e02ea3c88f7d26eb4ef1e4abd65967"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.161547 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ba56c9f30e0ffff6801ed4d35acc836cc59fbed00c4ebfc129d8efd77563ccb7"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.163626 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-6s6cl_84abd826-e5c1-4868-920c-10986d5e840c/kube-multus-additional-cni-plugins/0.log" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.163738 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" event={"ID":"84abd826-e5c1-4868-920c-10986d5e840c","Type":"ContainerDied","Data":"9c636b04ed7c5ad82eea07c89829e53a12b5656c6fbf0557de3115bc7c9d8736"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.163787 4803 scope.go:117] "RemoveContainer" containerID="19d1e4abcb798d1c64471b7597811f42afe2eaa472802feacd485be3ac78e654" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.163795 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-6s6cl" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.173236 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" event={"ID":"7def941c-8b59-47e1-b0a8-b9b91e0ef645","Type":"ContainerDied","Data":"8cee681538f2ccc934609ea65f6c1944e309e3ff0be16183a29f781dc3c76e3a"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.173270 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68bb848755-bf8k6" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.179303 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" event={"ID":"d7d21bfd-4a56-4665-a9db-e3558458f4bf","Type":"ContainerStarted","Data":"967da6aeec714198241a1eacb634159b3fe8884d0641331173b0b508044c7809"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.180254 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d349bec2-b61d-40f3-9331-580aac5a4d4d","Type":"ContainerStarted","Data":"43d0e7ca4d5414126bf1a8fa313030f40772857b7899656cab378d9eb4ab6ed0"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.181760 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" event={"ID":"58efc4b9-e931-4a7b-a306-a91f23b87a1f","Type":"ContainerDied","Data":"6b2558f4e142615923af90f06eb3a05845be2925bc6986ee431a7b5e0d83b047"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.182050 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.185733 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"7c9be265719f0dad898d1d21aa611c6a06b209581fa560b78c56e1499be8bc65"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.185797 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cce381c9f10795dcc0b3a2d33080d50f5d7287a336695c29c5be4f08b393c2cb"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.188803 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627","Type":"ContainerStarted","Data":"a601dffb1fd1e01d8a35d26579c2ceaa90a90f469c868314bf5af5acdd1e8622"} Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.211741 4803 scope.go:117] "RemoveContainer" containerID="027e59bd23fb30c4cc47864c3a126021709bd8100f6542e05d43d9e684efb61f" Mar 20 17:19:11 crc kubenswrapper[4803]: E0320 17:19:11.211794 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mr2dq" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" Mar 20 17:19:11 crc kubenswrapper[4803]: E0320 17:19:11.211880 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-5dnfh" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.216899 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6s6cl"] Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.225321 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-6s6cl"] Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.237711 4803 scope.go:117] "RemoveContainer" containerID="bd04e64dcf138ba2f8902f34a07c6f43ffc493caf1fc3812a7a8c1c87477ec01" Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.253318 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-68bb848755-bf8k6"] Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.271594 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-68bb848755-bf8k6"] Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.309925 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"] Mar 20 17:19:11 crc kubenswrapper[4803]: I0320 17:19:11.313509 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78c67d479b-25j4w"] Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.207050 4803 generic.go:334] "Generic (PLEG): container finished" podID="1d426fdb-5c4f-4b52-a3db-c7e5a52ba627" containerID="613e957daa22a4db475f9fa7fd5777e581046ce03288c931b310b513f820ed55" exitCode=0 Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.207495 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627","Type":"ContainerDied","Data":"613e957daa22a4db475f9fa7fd5777e581046ce03288c931b310b513f820ed55"} Mar 20 17:19:12 
crc kubenswrapper[4803]: I0320 17:19:12.210932 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" event={"ID":"d7d21bfd-4a56-4665-a9db-e3558458f4bf","Type":"ContainerStarted","Data":"7e41d00847fc912aec5a01c4f52241dbda823f8377f1803ed9e6c7ec119d0fd6"} Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.211901 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.216914 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d349bec2-b61d-40f3-9331-580aac5a4d4d","Type":"ContainerStarted","Data":"7bd2383ce333476283702b71154dfdab9f10b6c61dd0f2426307d18eed4e857b"} Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.222040 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.242964 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=12.242940924 podStartE2EDuration="12.242940924s" podCreationTimestamp="2026-03-20 17:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:19:12.241800219 +0000 UTC m=+162.153392309" watchObservedRunningTime="2026-03-20 17:19:12.242940924 +0000 UTC m=+162.154532994" Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.865177 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58efc4b9-e931-4a7b-a306-a91f23b87a1f" path="/var/lib/kubelet/pods/58efc4b9-e931-4a7b-a306-a91f23b87a1f/volumes" Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.866940 4803 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7def941c-8b59-47e1-b0a8-b9b91e0ef645" path="/var/lib/kubelet/pods/7def941c-8b59-47e1-b0a8-b9b91e0ef645/volumes" Mar 20 17:19:12 crc kubenswrapper[4803]: I0320 17:19:12.868831 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84abd826-e5c1-4868-920c-10986d5e840c" path="/var/lib/kubelet/pods/84abd826-e5c1-4868-920c-10986d5e840c/volumes" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.146742 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" podStartSLOduration=16.146724981 podStartE2EDuration="16.146724981s" podCreationTimestamp="2026-03-20 17:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:19:12.269181744 +0000 UTC m=+162.180773814" watchObservedRunningTime="2026-03-20 17:19:13.146724981 +0000 UTC m=+163.058317061" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.151644 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d7689b4c-s77vf"] Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.163767 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.166143 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.166428 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.166613 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.167374 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.167369 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d7689b4c-s77vf"] Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.167778 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.176638 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-client-ca\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.176772 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rkf\" (UniqueName: \"kubernetes.io/projected/17fba358-cdfb-4ec0-9547-5352f9f35bd0-kube-api-access-w8rkf\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: 
\"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.176842 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-proxy-ca-bundles\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.176896 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-config\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.176930 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fba358-cdfb-4ec0-9547-5352f9f35bd0-serving-cert\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.178276 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.182993 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.237112 4803 generic.go:334] "Generic (PLEG): container finished" podID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" 
containerID="dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a" exitCode=0 Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.237262 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjntl" event={"ID":"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3","Type":"ContainerDied","Data":"dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a"} Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.278557 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rkf\" (UniqueName: \"kubernetes.io/projected/17fba358-cdfb-4ec0-9547-5352f9f35bd0-kube-api-access-w8rkf\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.279222 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-proxy-ca-bundles\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.279303 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-config\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.279366 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fba358-cdfb-4ec0-9547-5352f9f35bd0-serving-cert\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: 
\"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.279416 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-client-ca\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.280736 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-proxy-ca-bundles\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.280856 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-config\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.280905 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-client-ca\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.287393 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fba358-cdfb-4ec0-9547-5352f9f35bd0-serving-cert\") pod 
\"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.298667 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rkf\" (UniqueName: \"kubernetes.io/projected/17fba358-cdfb-4ec0-9547-5352f9f35bd0-kube-api-access-w8rkf\") pod \"controller-manager-6d7689b4c-s77vf\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") " pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.447268 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.493886 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.583556 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kube-api-access\") pod \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.583857 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kubelet-dir\") pod \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\" (UID: \"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627\") " Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.583989 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1d426fdb-5c4f-4b52-a3db-c7e5a52ba627" (UID: 
"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.592107 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1d426fdb-5c4f-4b52-a3db-c7e5a52ba627" (UID: "1d426fdb-5c4f-4b52-a3db-c7e5a52ba627"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.685575 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.685618 4803 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d426fdb-5c4f-4b52-a3db-c7e5a52ba627-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:13 crc kubenswrapper[4803]: I0320 17:19:13.691135 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d7689b4c-s77vf"] Mar 20 17:19:13 crc kubenswrapper[4803]: W0320 17:19:13.700759 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17fba358_cdfb_4ec0_9547_5352f9f35bd0.slice/crio-c9d8c1c15744aa1e9f3e3587ec43e28d800f2dbed6158aed308592b337884e19 WatchSource:0}: Error finding container c9d8c1c15744aa1e9f3e3587ec43e28d800f2dbed6158aed308592b337884e19: Status 404 returned error can't find the container with id c9d8c1c15744aa1e9f3e3587ec43e28d800f2dbed6158aed308592b337884e19 Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.244086 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"1d426fdb-5c4f-4b52-a3db-c7e5a52ba627","Type":"ContainerDied","Data":"a601dffb1fd1e01d8a35d26579c2ceaa90a90f469c868314bf5af5acdd1e8622"} Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.244355 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a601dffb1fd1e01d8a35d26579c2ceaa90a90f469c868314bf5af5acdd1e8622" Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.244101 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.245355 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" event={"ID":"17fba358-cdfb-4ec0-9547-5352f9f35bd0","Type":"ContainerStarted","Data":"bfa199081ac426342a619740bb7b4fbd109c20590df25107f84181e36c312032"} Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.245419 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" event={"ID":"17fba358-cdfb-4ec0-9547-5352f9f35bd0","Type":"ContainerStarted","Data":"c9d8c1c15744aa1e9f3e3587ec43e28d800f2dbed6158aed308592b337884e19"} Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.245709 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.247220 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjntl" event={"ID":"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3","Type":"ContainerStarted","Data":"8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f"} Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.252948 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.270010 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" podStartSLOduration=17.269989611 podStartE2EDuration="17.269989611s" podCreationTimestamp="2026-03-20 17:18:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:19:14.267545654 +0000 UTC m=+164.179137744" watchObservedRunningTime="2026-03-20 17:19:14.269989611 +0000 UTC m=+164.181581691" Mar 20 17:19:14 crc kubenswrapper[4803]: I0320 17:19:14.325316 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gjntl" podStartSLOduration=2.27750358 podStartE2EDuration="54.325298408s" podCreationTimestamp="2026-03-20 17:18:20 +0000 UTC" firstStartedPulling="2026-03-20 17:18:21.650885901 +0000 UTC m=+111.562477971" lastFinishedPulling="2026-03-20 17:19:13.698680729 +0000 UTC m=+163.610272799" observedRunningTime="2026-03-20 17:19:14.323934855 +0000 UTC m=+164.235526935" watchObservedRunningTime="2026-03-20 17:19:14.325298408 +0000 UTC m=+164.236890468" Mar 20 17:19:16 crc kubenswrapper[4803]: I0320 17:19:16.014654 4803 csr.go:261] certificate signing request csr-7qnm2 is approved, waiting to be issued Mar 20 17:19:16 crc kubenswrapper[4803]: I0320 17:19:16.023275 4803 csr.go:257] certificate signing request csr-7qnm2 is issued Mar 20 17:19:16 crc kubenswrapper[4803]: I0320 17:19:16.261818 4803 generic.go:334] "Generic (PLEG): container finished" podID="96658fb9-4742-457e-b7ec-384ef06ec6a8" containerID="e3af574ba8a9730f62410c91e8662dd86c731535ce0fdd64402ef40a0e98f056" exitCode=0 Mar 20 17:19:16 crc kubenswrapper[4803]: I0320 17:19:16.261924 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29567118-clg6s" event={"ID":"96658fb9-4742-457e-b7ec-384ef06ec6a8","Type":"ContainerDied","Data":"e3af574ba8a9730f62410c91e8662dd86c731535ce0fdd64402ef40a0e98f056"} Mar 20 17:19:17 crc kubenswrapper[4803]: I0320 17:19:17.024405 4803 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-17 10:56:18.026866851 +0000 UTC Mar 20 17:19:17 crc kubenswrapper[4803]: I0320 17:19:17.026509 4803 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6521h37m1.000368111s for next certificate rotation Mar 20 17:19:17 crc kubenswrapper[4803]: I0320 17:19:17.596598 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567118-clg6s" Mar 20 17:19:17 crc kubenswrapper[4803]: I0320 17:19:17.748048 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ztqg\" (UniqueName: \"kubernetes.io/projected/96658fb9-4742-457e-b7ec-384ef06ec6a8-kube-api-access-7ztqg\") pod \"96658fb9-4742-457e-b7ec-384ef06ec6a8\" (UID: \"96658fb9-4742-457e-b7ec-384ef06ec6a8\") " Mar 20 17:19:17 crc kubenswrapper[4803]: I0320 17:19:17.759605 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96658fb9-4742-457e-b7ec-384ef06ec6a8-kube-api-access-7ztqg" (OuterVolumeSpecName: "kube-api-access-7ztqg") pod "96658fb9-4742-457e-b7ec-384ef06ec6a8" (UID: "96658fb9-4742-457e-b7ec-384ef06ec6a8"). InnerVolumeSpecName "kube-api-access-7ztqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:19:17 crc kubenswrapper[4803]: I0320 17:19:17.848923 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ztqg\" (UniqueName: \"kubernetes.io/projected/96658fb9-4742-457e-b7ec-384ef06ec6a8-kube-api-access-7ztqg\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:18 crc kubenswrapper[4803]: I0320 17:19:18.027180 4803 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-21 11:50:02.255170272 +0000 UTC Mar 20 17:19:18 crc kubenswrapper[4803]: I0320 17:19:18.027217 4803 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5898h30m44.22795587s for next certificate rotation Mar 20 17:19:18 crc kubenswrapper[4803]: I0320 17:19:18.279737 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567118-clg6s" event={"ID":"96658fb9-4742-457e-b7ec-384ef06ec6a8","Type":"ContainerDied","Data":"ad2a6326995dec5239845e2a6a52239d479bbc5184e792b77908d867220f9e5f"} Mar 20 17:19:18 crc kubenswrapper[4803]: I0320 17:19:18.279784 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad2a6326995dec5239845e2a6a52239d479bbc5184e792b77908d867220f9e5f" Mar 20 17:19:18 crc kubenswrapper[4803]: I0320 17:19:18.279870 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567118-clg6s" Mar 20 17:19:20 crc kubenswrapper[4803]: I0320 17:19:20.763289 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:19:20 crc kubenswrapper[4803]: I0320 17:19:20.763721 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:19:21 crc kubenswrapper[4803]: I0320 17:19:21.226157 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:19:21 crc kubenswrapper[4803]: I0320 17:19:21.299106 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf96s" event={"ID":"43d688d2-e15c-4129-8c1a-5b390313b012","Type":"ContainerStarted","Data":"8f5c72410efc7f8d5fcffb7aca3b3b571492fb7f3f17cea69d7aa0b6fb8ed0ae"} Mar 20 17:19:21 crc kubenswrapper[4803]: I0320 17:19:21.336516 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:19:22 crc kubenswrapper[4803]: I0320 17:19:22.304628 4803 generic.go:334] "Generic (PLEG): container finished" podID="43d688d2-e15c-4129-8c1a-5b390313b012" containerID="8f5c72410efc7f8d5fcffb7aca3b3b571492fb7f3f17cea69d7aa0b6fb8ed0ae" exitCode=0 Mar 20 17:19:22 crc kubenswrapper[4803]: I0320 17:19:22.304731 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf96s" event={"ID":"43d688d2-e15c-4129-8c1a-5b390313b012","Type":"ContainerDied","Data":"8f5c72410efc7f8d5fcffb7aca3b3b571492fb7f3f17cea69d7aa0b6fb8ed0ae"} Mar 20 17:19:25 crc kubenswrapper[4803]: I0320 17:19:25.322378 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsnl7" 
event={"ID":"198f3e33-8f27-4975-8d32-caa8e52db976","Type":"ContainerStarted","Data":"9f8c50bdb93287488b1047c1770812d09f5cf3ff0e0a78b078991900d9029491"} Mar 20 17:19:25 crc kubenswrapper[4803]: I0320 17:19:25.324358 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerStarted","Data":"31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a"} Mar 20 17:19:25 crc kubenswrapper[4803]: I0320 17:19:25.326181 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf96s" event={"ID":"43d688d2-e15c-4129-8c1a-5b390313b012","Type":"ContainerStarted","Data":"f09bfeb23e0a3cafe001066489bdc72d15559f3e0000871bd094631a866b8124"} Mar 20 17:19:25 crc kubenswrapper[4803]: I0320 17:19:25.327913 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gddcq" event={"ID":"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e","Type":"ContainerStarted","Data":"765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3"} Mar 20 17:19:25 crc kubenswrapper[4803]: I0320 17:19:25.329749 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t78tx" event={"ID":"139768c1-c8fa-4890-952b-2a9f3e152ca3","Type":"ContainerStarted","Data":"982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e"} Mar 20 17:19:25 crc kubenswrapper[4803]: I0320 17:19:25.410289 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qf96s" podStartSLOduration=3.606843994 podStartE2EDuration="1m5.410272072s" podCreationTimestamp="2026-03-20 17:18:20 +0000 UTC" firstStartedPulling="2026-03-20 17:18:22.679875675 +0000 UTC m=+112.591467745" lastFinishedPulling="2026-03-20 17:19:24.483303753 +0000 UTC m=+174.394895823" observedRunningTime="2026-03-20 17:19:25.406824815 +0000 UTC m=+175.318416885" 
watchObservedRunningTime="2026-03-20 17:19:25.410272072 +0000 UTC m=+175.321864142" Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.336315 4803 generic.go:334] "Generic (PLEG): container finished" podID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerID="31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a" exitCode=0 Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.336415 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerDied","Data":"31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a"} Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.340004 4803 generic.go:334] "Generic (PLEG): container finished" podID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerID="765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3" exitCode=0 Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.340138 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gddcq" event={"ID":"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e","Type":"ContainerDied","Data":"765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3"} Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.342348 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ncw7" event={"ID":"7e906a75-4a60-419b-9248-89a5e14229f0","Type":"ContainerStarted","Data":"7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19"} Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.344639 4803 generic.go:334] "Generic (PLEG): container finished" podID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerID="982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e" exitCode=0 Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.344700 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t78tx" 
event={"ID":"139768c1-c8fa-4890-952b-2a9f3e152ca3","Type":"ContainerDied","Data":"982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e"} Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.349332 4803 generic.go:334] "Generic (PLEG): container finished" podID="198f3e33-8f27-4975-8d32-caa8e52db976" containerID="9f8c50bdb93287488b1047c1770812d09f5cf3ff0e0a78b078991900d9029491" exitCode=0 Mar 20 17:19:26 crc kubenswrapper[4803]: I0320 17:19:26.349361 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsnl7" event={"ID":"198f3e33-8f27-4975-8d32-caa8e52db976","Type":"ContainerDied","Data":"9f8c50bdb93287488b1047c1770812d09f5cf3ff0e0a78b078991900d9029491"} Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.356075 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsnl7" event={"ID":"198f3e33-8f27-4975-8d32-caa8e52db976","Type":"ContainerStarted","Data":"f989f6268faff4205221804cd6257a28d0e974cb2b4a57ee487950413532d5d3"} Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.358555 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerStarted","Data":"c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135"} Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.360474 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gddcq" event={"ID":"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e","Type":"ContainerStarted","Data":"f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b"} Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.362930 4803 generic.go:334] "Generic (PLEG): container finished" podID="7e906a75-4a60-419b-9248-89a5e14229f0" containerID="7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19" exitCode=0 Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 
17:19:27.362997 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ncw7" event={"ID":"7e906a75-4a60-419b-9248-89a5e14229f0","Type":"ContainerDied","Data":"7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19"} Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.365119 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t78tx" event={"ID":"139768c1-c8fa-4890-952b-2a9f3e152ca3","Type":"ContainerStarted","Data":"455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c"} Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.366999 4803 generic.go:334] "Generic (PLEG): container finished" podID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerID="c3ca2636e6377d51b6d427de6ad2d0a1c28d4f881f7c2e93cde69e68de9023fb" exitCode=0 Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.367026 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dnfh" event={"ID":"21d3c1fc-3d92-45a2-aa9c-23f7407aa531","Type":"ContainerDied","Data":"c3ca2636e6377d51b6d427de6ad2d0a1c28d4f881f7c2e93cde69e68de9023fb"} Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.383017 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vsnl7" podStartSLOduration=3.137866685 podStartE2EDuration="1m7.383002433s" podCreationTimestamp="2026-03-20 17:18:20 +0000 UTC" firstStartedPulling="2026-03-20 17:18:22.694066745 +0000 UTC m=+112.605658815" lastFinishedPulling="2026-03-20 17:19:26.939202483 +0000 UTC m=+176.850794563" observedRunningTime="2026-03-20 17:19:27.382126356 +0000 UTC m=+177.293718426" watchObservedRunningTime="2026-03-20 17:19:27.383002433 +0000 UTC m=+177.294594503" Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.400819 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gddcq" 
podStartSLOduration=2.234792013 podStartE2EDuration="1m7.400802479s" podCreationTimestamp="2026-03-20 17:18:20 +0000 UTC" firstStartedPulling="2026-03-20 17:18:21.672940346 +0000 UTC m=+111.584532416" lastFinishedPulling="2026-03-20 17:19:26.838950802 +0000 UTC m=+176.750542882" observedRunningTime="2026-03-20 17:19:27.397666691 +0000 UTC m=+177.309258771" watchObservedRunningTime="2026-03-20 17:19:27.400802479 +0000 UTC m=+177.312394549" Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.435706 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t78tx" podStartSLOduration=3.133886829 podStartE2EDuration="1m4.435689249s" podCreationTimestamp="2026-03-20 17:18:23 +0000 UTC" firstStartedPulling="2026-03-20 17:18:25.808243248 +0000 UTC m=+115.719835318" lastFinishedPulling="2026-03-20 17:19:27.110045668 +0000 UTC m=+177.021637738" observedRunningTime="2026-03-20 17:19:27.435618456 +0000 UTC m=+177.347210546" watchObservedRunningTime="2026-03-20 17:19:27.435689249 +0000 UTC m=+177.347281309" Mar 20 17:19:27 crc kubenswrapper[4803]: I0320 17:19:27.473110 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mr2dq" podStartSLOduration=4.40852286 podStartE2EDuration="1m5.473093007s" podCreationTimestamp="2026-03-20 17:18:22 +0000 UTC" firstStartedPulling="2026-03-20 17:18:25.821866658 +0000 UTC m=+115.733458728" lastFinishedPulling="2026-03-20 17:19:26.886436795 +0000 UTC m=+176.798028875" observedRunningTime="2026-03-20 17:19:27.471515898 +0000 UTC m=+177.383107978" watchObservedRunningTime="2026-03-20 17:19:27.473093007 +0000 UTC m=+177.384685077" Mar 20 17:19:28 crc kubenswrapper[4803]: I0320 17:19:28.374611 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ncw7" 
event={"ID":"7e906a75-4a60-419b-9248-89a5e14229f0","Type":"ContainerStarted","Data":"5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c"} Mar 20 17:19:28 crc kubenswrapper[4803]: I0320 17:19:28.377886 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dnfh" event={"ID":"21d3c1fc-3d92-45a2-aa9c-23f7407aa531","Type":"ContainerStarted","Data":"9dc4234426186ec2da8118b672f6ca6e607c504504d86a0dfc4bee9e37704015"} Mar 20 17:19:28 crc kubenswrapper[4803]: I0320 17:19:28.395647 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8ncw7" podStartSLOduration=3.1573267720000002 podStartE2EDuration="1m5.395628049s" podCreationTimestamp="2026-03-20 17:18:23 +0000 UTC" firstStartedPulling="2026-03-20 17:18:25.805299698 +0000 UTC m=+115.716891768" lastFinishedPulling="2026-03-20 17:19:28.043600975 +0000 UTC m=+177.955193045" observedRunningTime="2026-03-20 17:19:28.392310345 +0000 UTC m=+178.303902425" watchObservedRunningTime="2026-03-20 17:19:28.395628049 +0000 UTC m=+178.307220129" Mar 20 17:19:28 crc kubenswrapper[4803]: I0320 17:19:28.428501 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5dnfh" podStartSLOduration=4.471787258 podStartE2EDuration="1m6.428484115s" podCreationTimestamp="2026-03-20 17:18:22 +0000 UTC" firstStartedPulling="2026-03-20 17:18:25.814633014 +0000 UTC m=+115.726225084" lastFinishedPulling="2026-03-20 17:19:27.771329871 +0000 UTC m=+177.682921941" observedRunningTime="2026-03-20 17:19:28.423885611 +0000 UTC m=+178.335477691" watchObservedRunningTime="2026-03-20 17:19:28.428484115 +0000 UTC m=+178.340076195" Mar 20 17:19:30 crc kubenswrapper[4803]: I0320 17:19:30.552805 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:19:30 crc kubenswrapper[4803]: I0320 17:19:30.553117 
4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gddcq"
Mar 20 17:19:30 crc kubenswrapper[4803]: I0320 17:19:30.598285 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gddcq"
Mar 20 17:19:30 crc kubenswrapper[4803]: I0320 17:19:30.960802 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vsnl7"
Mar 20 17:19:30 crc kubenswrapper[4803]: I0320 17:19:30.960927 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vsnl7"
Mar 20 17:19:31 crc kubenswrapper[4803]: I0320 17:19:31.018166 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vsnl7"
Mar 20 17:19:31 crc kubenswrapper[4803]: I0320 17:19:31.148428 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qf96s"
Mar 20 17:19:31 crc kubenswrapper[4803]: I0320 17:19:31.148486 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qf96s"
Mar 20 17:19:31 crc kubenswrapper[4803]: I0320 17:19:31.184082 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qf96s"
Mar 20 17:19:31 crc kubenswrapper[4803]: I0320 17:19:31.452110 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gddcq"
Mar 20 17:19:31 crc kubenswrapper[4803]: I0320 17:19:31.468310 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qf96s"
Mar 20 17:19:32 crc kubenswrapper[4803]: I0320 17:19:32.446577 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vsnl7"
Mar 20 17:19:32 crc kubenswrapper[4803]: I0320 17:19:32.815654 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mr2dq"
Mar 20 17:19:32 crc kubenswrapper[4803]: I0320 17:19:32.815723 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mr2dq"
Mar 20 17:19:32 crc kubenswrapper[4803]: I0320 17:19:32.867774 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mr2dq"
Mar 20 17:19:33 crc kubenswrapper[4803]: I0320 17:19:33.195818 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:19:33 crc kubenswrapper[4803]: I0320 17:19:33.197034 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:19:33 crc kubenswrapper[4803]: I0320 17:19:33.243016 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:19:33 crc kubenswrapper[4803]: I0320 17:19:33.462150 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:19:33 crc kubenswrapper[4803]: I0320 17:19:33.463964 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mr2dq"
Mar 20 17:19:33 crc kubenswrapper[4803]: I0320 17:19:33.762298 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:19:33 crc kubenswrapper[4803]: I0320 17:19:33.762632 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t78tx"
Mar 20 17:19:34 crc kubenswrapper[4803]: I0320 17:19:34.146418 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:19:34 crc kubenswrapper[4803]: I0320 17:19:34.146474 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8ncw7"
Mar 20 17:19:34 crc kubenswrapper[4803]: I0320 17:19:34.801838 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t78tx" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:19:34 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:19:34 crc kubenswrapper[4803]: >
Mar 20 17:19:35 crc kubenswrapper[4803]: I0320 17:19:35.084480 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsnl7"]
Mar 20 17:19:35 crc kubenswrapper[4803]: I0320 17:19:35.084694 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vsnl7" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="registry-server" containerID="cri-o://f989f6268faff4205221804cd6257a28d0e974cb2b4a57ee487950413532d5d3" gracePeriod=2
Mar 20 17:19:35 crc kubenswrapper[4803]: I0320 17:19:35.183803 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8ncw7" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:19:35 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:19:35 crc kubenswrapper[4803]: >
Mar 20 17:19:35 crc kubenswrapper[4803]: I0320 17:19:35.284437 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qf96s"]
Mar 20 17:19:35 crc kubenswrapper[4803]: I0320 17:19:35.284665 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qf96s" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="registry-server" containerID="cri-o://f09bfeb23e0a3cafe001066489bdc72d15559f3e0000871bd094631a866b8124" gracePeriod=2
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.333262 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d7689b4c-s77vf"]
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.333806 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" podUID="17fba358-cdfb-4ec0-9547-5352f9f35bd0" containerName="controller-manager" containerID="cri-o://bfa199081ac426342a619740bb7b4fbd109c20590df25107f84181e36c312032" gracePeriod=30
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.420322 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"]
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.420614 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" podUID="d7d21bfd-4a56-4665-a9db-e3558458f4bf" containerName="route-controller-manager" containerID="cri-o://7e41d00847fc912aec5a01c4f52241dbda823f8377f1803ed9e6c7ec119d0fd6" gracePeriod=30
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.453814 4803 generic.go:334] "Generic (PLEG): container finished" podID="198f3e33-8f27-4975-8d32-caa8e52db976" containerID="f989f6268faff4205221804cd6257a28d0e974cb2b4a57ee487950413532d5d3" exitCode=0
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.453959 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsnl7" event={"ID":"198f3e33-8f27-4975-8d32-caa8e52db976","Type":"ContainerDied","Data":"f989f6268faff4205221804cd6257a28d0e974cb2b4a57ee487950413532d5d3"}
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.453988 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vsnl7" event={"ID":"198f3e33-8f27-4975-8d32-caa8e52db976","Type":"ContainerDied","Data":"02f86a0e8ffc595fa97718bf0f8b9c8628903a49bc0cef8ff044e33facdb913b"}
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.454122 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02f86a0e8ffc595fa97718bf0f8b9c8628903a49bc0cef8ff044e33facdb913b"
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.460919 4803 generic.go:334] "Generic (PLEG): container finished" podID="43d688d2-e15c-4129-8c1a-5b390313b012" containerID="f09bfeb23e0a3cafe001066489bdc72d15559f3e0000871bd094631a866b8124" exitCode=0
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.460971 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf96s" event={"ID":"43d688d2-e15c-4129-8c1a-5b390313b012","Type":"ContainerDied","Data":"f09bfeb23e0a3cafe001066489bdc72d15559f3e0000871bd094631a866b8124"}
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.485582 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dnfh"]
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.485680 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsnl7"
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.485776 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5dnfh" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="registry-server" containerID="cri-o://9dc4234426186ec2da8118b672f6ca6e607c504504d86a0dfc4bee9e37704015" gracePeriod=2
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.621775 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-utilities\") pod \"198f3e33-8f27-4975-8d32-caa8e52db976\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") "
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.621825 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-catalog-content\") pod \"198f3e33-8f27-4975-8d32-caa8e52db976\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") "
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.621898 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss7z6\" (UniqueName: \"kubernetes.io/projected/198f3e33-8f27-4975-8d32-caa8e52db976-kube-api-access-ss7z6\") pod \"198f3e33-8f27-4975-8d32-caa8e52db976\" (UID: \"198f3e33-8f27-4975-8d32-caa8e52db976\") "
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.623513 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-utilities" (OuterVolumeSpecName: "utilities") pod "198f3e33-8f27-4975-8d32-caa8e52db976" (UID: "198f3e33-8f27-4975-8d32-caa8e52db976"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.629536 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/198f3e33-8f27-4975-8d32-caa8e52db976-kube-api-access-ss7z6" (OuterVolumeSpecName: "kube-api-access-ss7z6") pod "198f3e33-8f27-4975-8d32-caa8e52db976" (UID: "198f3e33-8f27-4975-8d32-caa8e52db976"). InnerVolumeSpecName "kube-api-access-ss7z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.658286 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qf96s"
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.677443 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "198f3e33-8f27-4975-8d32-caa8e52db976" (UID: "198f3e33-8f27-4975-8d32-caa8e52db976"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.723612 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.723647 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/198f3e33-8f27-4975-8d32-caa8e52db976-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.723660 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss7z6\" (UniqueName: \"kubernetes.io/projected/198f3e33-8f27-4975-8d32-caa8e52db976-kube-api-access-ss7z6\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.824609 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-catalog-content\") pod \"43d688d2-e15c-4129-8c1a-5b390313b012\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") "
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.824649 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-utilities\") pod \"43d688d2-e15c-4129-8c1a-5b390313b012\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") "
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.824676 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wpqd\" (UniqueName: \"kubernetes.io/projected/43d688d2-e15c-4129-8c1a-5b390313b012-kube-api-access-8wpqd\") pod \"43d688d2-e15c-4129-8c1a-5b390313b012\" (UID: \"43d688d2-e15c-4129-8c1a-5b390313b012\") "
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.826027 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-utilities" (OuterVolumeSpecName: "utilities") pod "43d688d2-e15c-4129-8c1a-5b390313b012" (UID: "43d688d2-e15c-4129-8c1a-5b390313b012"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.830587 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43d688d2-e15c-4129-8c1a-5b390313b012-kube-api-access-8wpqd" (OuterVolumeSpecName: "kube-api-access-8wpqd") pod "43d688d2-e15c-4129-8c1a-5b390313b012" (UID: "43d688d2-e15c-4129-8c1a-5b390313b012"). InnerVolumeSpecName "kube-api-access-8wpqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.919642 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43d688d2-e15c-4129-8c1a-5b390313b012" (UID: "43d688d2-e15c-4129-8c1a-5b390313b012"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.925951 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.926005 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43d688d2-e15c-4129-8c1a-5b390313b012-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:37 crc kubenswrapper[4803]: I0320 17:19:37.926025 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wpqd\" (UniqueName: \"kubernetes.io/projected/43d688d2-e15c-4129-8c1a-5b390313b012-kube-api-access-8wpqd\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.478449 4803 generic.go:334] "Generic (PLEG): container finished" podID="17fba358-cdfb-4ec0-9547-5352f9f35bd0" containerID="bfa199081ac426342a619740bb7b4fbd109c20590df25107f84181e36c312032" exitCode=0
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.478588 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" event={"ID":"17fba358-cdfb-4ec0-9547-5352f9f35bd0","Type":"ContainerDied","Data":"bfa199081ac426342a619740bb7b4fbd109c20590df25107f84181e36c312032"}
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.482848 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qf96s" event={"ID":"43d688d2-e15c-4129-8c1a-5b390313b012","Type":"ContainerDied","Data":"0c88ee11d2ae1423a68a257e93f1edcb66581e607380a74230cdf007d9feb318"}
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.482908 4803 scope.go:117] "RemoveContainer" containerID="f09bfeb23e0a3cafe001066489bdc72d15559f3e0000871bd094631a866b8124"
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.483020 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qf96s"
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.489140 4803 generic.go:334] "Generic (PLEG): container finished" podID="d7d21bfd-4a56-4665-a9db-e3558458f4bf" containerID="7e41d00847fc912aec5a01c4f52241dbda823f8377f1803ed9e6c7ec119d0fd6" exitCode=0
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.489257 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" event={"ID":"d7d21bfd-4a56-4665-a9db-e3558458f4bf","Type":"ContainerDied","Data":"7e41d00847fc912aec5a01c4f52241dbda823f8377f1803ed9e6c7ec119d0fd6"}
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.489285 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vsnl7"
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.508964 4803 scope.go:117] "RemoveContainer" containerID="8f5c72410efc7f8d5fcffb7aca3b3b571492fb7f3f17cea69d7aa0b6fb8ed0ae"
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.525882 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qf96s"]
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.529488 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qf96s"]
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.553824 4803 scope.go:117] "RemoveContainer" containerID="3061a34269426f4684848f0238cba11b4e2e0b0259e3150500fc01d8eaaaaf69"
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.581150 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vsnl7"]
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.587509 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vsnl7"]
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.858749 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" path="/var/lib/kubelet/pods/198f3e33-8f27-4975-8d32-caa8e52db976/volumes"
Mar 20 17:19:38 crc kubenswrapper[4803]: I0320 17:19:38.860034 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" path="/var/lib/kubelet/pods/43d688d2-e15c-4129-8c1a-5b390313b012/volumes"
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.307333 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf"
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.449318 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-config\") pod \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.449403 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fba358-cdfb-4ec0-9547-5352f9f35bd0-serving-cert\") pod \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.449443 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-client-ca\") pod \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.449491 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-proxy-ca-bundles\") pod \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.449542 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rkf\" (UniqueName: \"kubernetes.io/projected/17fba358-cdfb-4ec0-9547-5352f9f35bd0-kube-api-access-w8rkf\") pod \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\" (UID: \"17fba358-cdfb-4ec0-9547-5352f9f35bd0\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.450642 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "17fba358-cdfb-4ec0-9547-5352f9f35bd0" (UID: "17fba358-cdfb-4ec0-9547-5352f9f35bd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.450688 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "17fba358-cdfb-4ec0-9547-5352f9f35bd0" (UID: "17fba358-cdfb-4ec0-9547-5352f9f35bd0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.450767 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-config" (OuterVolumeSpecName: "config") pod "17fba358-cdfb-4ec0-9547-5352f9f35bd0" (UID: "17fba358-cdfb-4ec0-9547-5352f9f35bd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.469803 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17fba358-cdfb-4ec0-9547-5352f9f35bd0-kube-api-access-w8rkf" (OuterVolumeSpecName: "kube-api-access-w8rkf") pod "17fba358-cdfb-4ec0-9547-5352f9f35bd0" (UID: "17fba358-cdfb-4ec0-9547-5352f9f35bd0"). InnerVolumeSpecName "kube-api-access-w8rkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.470340 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17fba358-cdfb-4ec0-9547-5352f9f35bd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17fba358-cdfb-4ec0-9547-5352f9f35bd0" (UID: "17fba358-cdfb-4ec0-9547-5352f9f35bd0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.501350 4803 generic.go:334] "Generic (PLEG): container finished" podID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerID="9dc4234426186ec2da8118b672f6ca6e607c504504d86a0dfc4bee9e37704015" exitCode=0
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.501429 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dnfh" event={"ID":"21d3c1fc-3d92-45a2-aa9c-23f7407aa531","Type":"ContainerDied","Data":"9dc4234426186ec2da8118b672f6ca6e607c504504d86a0dfc4bee9e37704015"}
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.502662 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf" event={"ID":"17fba358-cdfb-4ec0-9547-5352f9f35bd0","Type":"ContainerDied","Data":"c9d8c1c15744aa1e9f3e3587ec43e28d800f2dbed6158aed308592b337884e19"}
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.502721 4803 scope.go:117] "RemoveContainer" containerID="bfa199081ac426342a619740bb7b4fbd109c20590df25107f84181e36c312032"
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.502988 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d7689b4c-s77vf"
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.559847 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fba358-cdfb-4ec0-9547-5352f9f35bd0-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.559897 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.559914 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.559929 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8rkf\" (UniqueName: \"kubernetes.io/projected/17fba358-cdfb-4ec0-9547-5352f9f35bd0-kube-api-access-w8rkf\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.559945 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fba358-cdfb-4ec0-9547-5352f9f35bd0-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.597953 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d7689b4c-s77vf"]
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.607150 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d7689b4c-s77vf"]
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.690030 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.862200 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxk7h\" (UniqueName: \"kubernetes.io/projected/d7d21bfd-4a56-4665-a9db-e3558458f4bf-kube-api-access-xxk7h\") pod \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.862313 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d21bfd-4a56-4665-a9db-e3558458f4bf-serving-cert\") pod \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.862608 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-client-ca\") pod \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.862727 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-config\") pod \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\" (UID: \"d7d21bfd-4a56-4665-a9db-e3558458f4bf\") "
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.863397 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-client-ca" (OuterVolumeSpecName: "client-ca") pod "d7d21bfd-4a56-4665-a9db-e3558458f4bf" (UID: "d7d21bfd-4a56-4665-a9db-e3558458f4bf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.863601 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-config" (OuterVolumeSpecName: "config") pod "d7d21bfd-4a56-4665-a9db-e3558458f4bf" (UID: "d7d21bfd-4a56-4665-a9db-e3558458f4bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.868143 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7d21bfd-4a56-4665-a9db-e3558458f4bf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7d21bfd-4a56-4665-a9db-e3558458f4bf" (UID: "d7d21bfd-4a56-4665-a9db-e3558458f4bf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.868146 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d21bfd-4a56-4665-a9db-e3558458f4bf-kube-api-access-xxk7h" (OuterVolumeSpecName: "kube-api-access-xxk7h") pod "d7d21bfd-4a56-4665-a9db-e3558458f4bf" (UID: "d7d21bfd-4a56-4665-a9db-e3558458f4bf"). InnerVolumeSpecName "kube-api-access-xxk7h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.963811 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7d21bfd-4a56-4665-a9db-e3558458f4bf-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.963857 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.963870 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7d21bfd-4a56-4665-a9db-e3558458f4bf-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:39 crc kubenswrapper[4803]: I0320 17:19:39.963882 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxk7h\" (UniqueName: \"kubernetes.io/projected/d7d21bfd-4a56-4665-a9db-e3558458f4bf-kube-api-access-xxk7h\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.161019 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.267467 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-utilities\") pod \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") "
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.267549 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7ffc\" (UniqueName: \"kubernetes.io/projected/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-kube-api-access-n7ffc\") pod \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") "
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.267638 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-catalog-content\") pod \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\" (UID: \"21d3c1fc-3d92-45a2-aa9c-23f7407aa531\") "
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.268729 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-utilities" (OuterVolumeSpecName: "utilities") pod "21d3c1fc-3d92-45a2-aa9c-23f7407aa531" (UID: "21d3c1fc-3d92-45a2-aa9c-23f7407aa531"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.271864 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-kube-api-access-n7ffc" (OuterVolumeSpecName: "kube-api-access-n7ffc") pod "21d3c1fc-3d92-45a2-aa9c-23f7407aa531" (UID: "21d3c1fc-3d92-45a2-aa9c-23f7407aa531"). InnerVolumeSpecName "kube-api-access-n7ffc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.299726 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21d3c1fc-3d92-45a2-aa9c-23f7407aa531" (UID: "21d3c1fc-3d92-45a2-aa9c-23f7407aa531"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.369238 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.369268 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7ffc\" (UniqueName: \"kubernetes.io/projected/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-kube-api-access-n7ffc\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.369278 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21d3c1fc-3d92-45a2-aa9c-23f7407aa531-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.517879 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5dnfh" event={"ID":"21d3c1fc-3d92-45a2-aa9c-23f7407aa531","Type":"ContainerDied","Data":"9b3771d0550cf5ed649b95db1d193f394551dfe3c4b14958c137794b3d35b9a5"}
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.517938 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5dnfh"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.517961 4803 scope.go:117] "RemoveContainer" containerID="9dc4234426186ec2da8118b672f6ca6e607c504504d86a0dfc4bee9e37704015"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.519166 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449" event={"ID":"d7d21bfd-4a56-4665-a9db-e3558458f4bf","Type":"ContainerDied","Data":"967da6aeec714198241a1eacb634159b3fe8884d0641331173b0b508044c7809"}
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.519226 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.537551 4803 scope.go:117] "RemoveContainer" containerID="c3ca2636e6377d51b6d427de6ad2d0a1c28d4f881f7c2e93cde69e68de9023fb"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.559917 4803 scope.go:117] "RemoveContainer" containerID="6369edd74cf75ba67be0ebef298eb3eba9632e4ffa87ed0906825392bd22c018"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.565874 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"]
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.571118 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66fffff6fc-nh449"]
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.578826 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dnfh"]
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.584436 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5dnfh"]
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.595810 4803 scope.go:117] "RemoveContainer" containerID="7e41d00847fc912aec5a01c4f52241dbda823f8377f1803ed9e6c7ec119d0fd6"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.855191 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17fba358-cdfb-4ec0-9547-5352f9f35bd0" path="/var/lib/kubelet/pods/17fba358-cdfb-4ec0-9547-5352f9f35bd0/volumes"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.855696 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" path="/var/lib/kubelet/pods/21d3c1fc-3d92-45a2-aa9c-23f7407aa531/volumes"
Mar 20 17:19:40 crc kubenswrapper[4803]: I0320 17:19:40.856273 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d21bfd-4a56-4665-a9db-e3558458f4bf" path="/var/lib/kubelet/pods/d7d21bfd-4a56-4665-a9db-e3558458f4bf/volumes"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.175105 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b8c575497-lsj7r"]
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.175953 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="registry-server"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.175993 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="registry-server"
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176006 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="extract-utilities"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176016 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="extract-utilities"
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176033 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96658fb9-4742-457e-b7ec-384ef06ec6a8" containerName="oc"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176043 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="96658fb9-4742-457e-b7ec-384ef06ec6a8" containerName="oc"
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176060 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="registry-server"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176068 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="registry-server"
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176085 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d21bfd-4a56-4665-a9db-e3558458f4bf" containerName="route-controller-manager"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176092 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d21bfd-4a56-4665-a9db-e3558458f4bf" containerName="route-controller-manager"
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176105 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="extract-content"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176117 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="extract-content"
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176126 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="extract-content"
Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176134 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="extract-content"
Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176152 4803 cpu_manager.go:410] "RemoveStaleState: removing container"
podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="extract-content" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176159 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="extract-content" Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176175 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="registry-server" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176182 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="registry-server" Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176199 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d426fdb-5c4f-4b52-a3db-c7e5a52ba627" containerName="pruner" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176206 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d426fdb-5c4f-4b52-a3db-c7e5a52ba627" containerName="pruner" Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176226 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="extract-utilities" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176256 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="extract-utilities" Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176274 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="extract-utilities" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176282 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="extract-utilities" Mar 20 17:19:41 crc kubenswrapper[4803]: E0320 17:19:41.176295 4803 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="17fba358-cdfb-4ec0-9547-5352f9f35bd0" containerName="controller-manager" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176303 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fba358-cdfb-4ec0-9547-5352f9f35bd0" containerName="controller-manager" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176503 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="198f3e33-8f27-4975-8d32-caa8e52db976" containerName="registry-server" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176541 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="17fba358-cdfb-4ec0-9547-5352f9f35bd0" containerName="controller-manager" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176551 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="43d688d2-e15c-4129-8c1a-5b390313b012" containerName="registry-server" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176568 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="96658fb9-4742-457e-b7ec-384ef06ec6a8" containerName="oc" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176582 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d21bfd-4a56-4665-a9db-e3558458f4bf" containerName="route-controller-manager" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176605 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d426fdb-5c4f-4b52-a3db-c7e5a52ba627" containerName="pruner" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.176669 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d3c1fc-3d92-45a2-aa9c-23f7407aa531" containerName="registry-server" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.177313 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.181280 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.181617 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.181799 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.182050 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.182265 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.182400 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.183139 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql"] Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.184032 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.186871 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.187560 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.187632 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.187736 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.187786 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.188014 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.190458 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.200634 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql"] Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.203315 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b8c575497-lsj7r"] Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.279864 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-config\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.279916 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-serving-cert\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.280355 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-client-ca\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.280604 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9ckj\" (UniqueName: \"kubernetes.io/projected/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-kube-api-access-f9ckj\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.280744 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-proxy-ca-bundles\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " 
pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.381824 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab886873-9ec5-48f3-aea1-dc39ab313aee-serving-cert\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382002 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-config\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382082 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-config\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382124 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-serving-cert\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382175 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-client-ca\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382249 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-client-ca\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382427 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9ckj\" (UniqueName: \"kubernetes.io/projected/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-kube-api-access-f9ckj\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382470 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtfqp\" (UniqueName: \"kubernetes.io/projected/ab886873-9ec5-48f3-aea1-dc39ab313aee-kube-api-access-jtfqp\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.382505 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-proxy-ca-bundles\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " 
pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.383346 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-config\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.383795 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-proxy-ca-bundles\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.383806 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-client-ca\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.402242 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-serving-cert\") pod \"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.424891 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9ckj\" (UniqueName: \"kubernetes.io/projected/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-kube-api-access-f9ckj\") pod 
\"controller-manager-5b8c575497-lsj7r\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") " pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.482977 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtfqp\" (UniqueName: \"kubernetes.io/projected/ab886873-9ec5-48f3-aea1-dc39ab313aee-kube-api-access-jtfqp\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.483038 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab886873-9ec5-48f3-aea1-dc39ab313aee-serving-cert\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.483086 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-config\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.483139 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-client-ca\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.484080 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-client-ca\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.485161 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-config\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.486615 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab886873-9ec5-48f3-aea1-dc39ab313aee-serving-cert\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.506028 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtfqp\" (UniqueName: \"kubernetes.io/projected/ab886873-9ec5-48f3-aea1-dc39ab313aee-kube-api-access-jtfqp\") pod \"route-controller-manager-7f956fc54d-t9vql\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.507559 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.519188 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:41 crc kubenswrapper[4803]: I0320 17:19:41.766212 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b8c575497-lsj7r"] Mar 20 17:19:41 crc kubenswrapper[4803]: W0320 17:19:41.772375 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3ec1cd8_503c_493e_a3a5_6ff6c17b0862.slice/crio-d991283f3669cc94c354e07ac177f9d1d3a7e38cdce6b40dc9e059270714c33e WatchSource:0}: Error finding container d991283f3669cc94c354e07ac177f9d1d3a7e38cdce6b40dc9e059270714c33e: Status 404 returned error can't find the container with id d991283f3669cc94c354e07ac177f9d1d3a7e38cdce6b40dc9e059270714c33e Mar 20 17:19:42 crc kubenswrapper[4803]: I0320 17:19:42.045927 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql"] Mar 20 17:19:42 crc kubenswrapper[4803]: W0320 17:19:42.051117 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab886873_9ec5_48f3_aea1_dc39ab313aee.slice/crio-75b0e52c78ab620643d7743b0d34cb908b9cdf2ec9a9a8d30592656d64713ae6 WatchSource:0}: Error finding container 75b0e52c78ab620643d7743b0d34cb908b9cdf2ec9a9a8d30592656d64713ae6: Status 404 returned error can't find the container with id 75b0e52c78ab620643d7743b0d34cb908b9cdf2ec9a9a8d30592656d64713ae6 Mar 20 17:19:42 crc kubenswrapper[4803]: I0320 17:19:42.560895 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" event={"ID":"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862","Type":"ContainerStarted","Data":"ad0198c8aeda7a43682c4987775dc644f0b6a3d9d51417b913aa496c2ab7abfa"} Mar 20 17:19:42 crc kubenswrapper[4803]: I0320 17:19:42.561886 4803 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" event={"ID":"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862","Type":"ContainerStarted","Data":"d991283f3669cc94c354e07ac177f9d1d3a7e38cdce6b40dc9e059270714c33e"} Mar 20 17:19:42 crc kubenswrapper[4803]: I0320 17:19:42.561963 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" event={"ID":"ab886873-9ec5-48f3-aea1-dc39ab313aee","Type":"ContainerStarted","Data":"75b0e52c78ab620643d7743b0d34cb908b9cdf2ec9a9a8d30592656d64713ae6"} Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.573098 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" event={"ID":"ab886873-9ec5-48f3-aea1-dc39ab313aee","Type":"ContainerStarted","Data":"96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b"} Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.573704 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.574321 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.582271 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.586715 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.609377 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" podStartSLOduration=6.609344987 podStartE2EDuration="6.609344987s" podCreationTimestamp="2026-03-20 17:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:19:43.598104896 +0000 UTC m=+193.509697076" watchObservedRunningTime="2026-03-20 17:19:43.609344987 +0000 UTC m=+193.520937127" Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.634745 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" podStartSLOduration=6.6347258799999995 podStartE2EDuration="6.63472588s" podCreationTimestamp="2026-03-20 17:19:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:19:43.632550172 +0000 UTC m=+193.544142312" watchObservedRunningTime="2026-03-20 17:19:43.63472588 +0000 UTC m=+193.546317960" Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.729115 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tsv6s"] Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.817497 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t78tx" Mar 20 17:19:43 crc kubenswrapper[4803]: I0320 17:19:43.861202 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t78tx" Mar 20 17:19:44 crc kubenswrapper[4803]: I0320 17:19:44.004394 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 17:19:44 crc kubenswrapper[4803]: I0320 17:19:44.193228 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-8ncw7" Mar 20 17:19:44 crc kubenswrapper[4803]: I0320 17:19:44.228449 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8ncw7" Mar 20 17:19:47 crc kubenswrapper[4803]: I0320 17:19:47.892254 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8ncw7"] Mar 20 17:19:47 crc kubenswrapper[4803]: I0320 17:19:47.892678 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8ncw7" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="registry-server" containerID="cri-o://5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c" gracePeriod=2 Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.482092 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8ncw7" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.586592 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plbpg\" (UniqueName: \"kubernetes.io/projected/7e906a75-4a60-419b-9248-89a5e14229f0-kube-api-access-plbpg\") pod \"7e906a75-4a60-419b-9248-89a5e14229f0\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.586689 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-catalog-content\") pod \"7e906a75-4a60-419b-9248-89a5e14229f0\" (UID: \"7e906a75-4a60-419b-9248-89a5e14229f0\") " Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.586746 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-utilities\") pod \"7e906a75-4a60-419b-9248-89a5e14229f0\" (UID: 
\"7e906a75-4a60-419b-9248-89a5e14229f0\") " Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.588673 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-utilities" (OuterVolumeSpecName: "utilities") pod "7e906a75-4a60-419b-9248-89a5e14229f0" (UID: "7e906a75-4a60-419b-9248-89a5e14229f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.596443 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e906a75-4a60-419b-9248-89a5e14229f0-kube-api-access-plbpg" (OuterVolumeSpecName: "kube-api-access-plbpg") pod "7e906a75-4a60-419b-9248-89a5e14229f0" (UID: "7e906a75-4a60-419b-9248-89a5e14229f0"). InnerVolumeSpecName "kube-api-access-plbpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.607575 4803 generic.go:334] "Generic (PLEG): container finished" podID="7e906a75-4a60-419b-9248-89a5e14229f0" containerID="5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c" exitCode=0 Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.607641 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ncw7" event={"ID":"7e906a75-4a60-419b-9248-89a5e14229f0","Type":"ContainerDied","Data":"5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c"} Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.607693 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8ncw7" event={"ID":"7e906a75-4a60-419b-9248-89a5e14229f0","Type":"ContainerDied","Data":"4f3f7cb0b9175f66b806d52fabb0a3f8293dd05c323d6576c04801b20dbfead9"} Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.607722 4803 scope.go:117] "RemoveContainer" 
containerID="5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.607748 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8ncw7" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.637901 4803 scope.go:117] "RemoveContainer" containerID="7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.668157 4803 scope.go:117] "RemoveContainer" containerID="15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.687918 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plbpg\" (UniqueName: \"kubernetes.io/projected/7e906a75-4a60-419b-9248-89a5e14229f0-kube-api-access-plbpg\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.687953 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.692941 4803 scope.go:117] "RemoveContainer" containerID="5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c" Mar 20 17:19:48 crc kubenswrapper[4803]: E0320 17:19:48.693556 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c\": container with ID starting with 5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c not found: ID does not exist" containerID="5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.693599 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c"} err="failed to get container status \"5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c\": rpc error: code = NotFound desc = could not find container \"5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c\": container with ID starting with 5fe66f99b1ed41b20f501b2ed9784b99dfb58ad95c3e71db1db9cd7ffdeb615c not found: ID does not exist" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.693624 4803 scope.go:117] "RemoveContainer" containerID="7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19" Mar 20 17:19:48 crc kubenswrapper[4803]: E0320 17:19:48.694128 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19\": container with ID starting with 7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19 not found: ID does not exist" containerID="7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.694158 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19"} err="failed to get container status \"7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19\": rpc error: code = NotFound desc = could not find container \"7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19\": container with ID starting with 7dd4e14fee98a2472b38e81dfb1d1caad1b9d6cf2c61935d9bf2415e43621b19 not found: ID does not exist" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.694176 4803 scope.go:117] "RemoveContainer" containerID="15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512" Mar 20 17:19:48 crc kubenswrapper[4803]: E0320 17:19:48.694581 4803 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512\": container with ID starting with 15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512 not found: ID does not exist" containerID="15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.694607 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512"} err="failed to get container status \"15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512\": rpc error: code = NotFound desc = could not find container \"15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512\": container with ID starting with 15e30763a6cfef6702c1a6fab2490d49d72fee00fb009539111a825108188512 not found: ID does not exist" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.767900 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e906a75-4a60-419b-9248-89a5e14229f0" (UID: "7e906a75-4a60-419b-9248-89a5e14229f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.789998 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e906a75-4a60-419b-9248-89a5e14229f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.945721 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8ncw7"] Mar 20 17:19:48 crc kubenswrapper[4803]: I0320 17:19:48.949639 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8ncw7"] Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.177903 4803 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.178280 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="registry-server" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.178304 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="registry-server" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.178317 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="extract-utilities" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.178325 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="extract-utilities" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.178334 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="extract-content" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.178342 4803 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="extract-content" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.178465 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" containerName="registry-server" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.179028 4803 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.179239 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.179471 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1" gracePeriod=15 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.179671 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a" gracePeriod=15 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.179729 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc" gracePeriod=15 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.179773 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77" gracePeriod=15 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.179858 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a" gracePeriod=15 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.183951 4803 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184103 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184114 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184125 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184131 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184140 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184146 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184157 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184163 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184173 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184180 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184190 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184197 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184206 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184212 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184219 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184225 4803 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.184234 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184240 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184380 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184390 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184396 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184404 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184483 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184495 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.184502 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 17:19:49 crc 
kubenswrapper[4803]: I0320 17:19:49.184509 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.247336 4803 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.300619 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.300826 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.300860 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.300921 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") 
pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.300947 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.300980 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.300994 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.301012 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.402816 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.402889 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.402935 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.402986 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403032 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403072 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403039 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403086 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403055 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403118 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403207 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403252 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403286 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403330 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403397 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.403487 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.548848 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:49 crc kubenswrapper[4803]: W0320 17:19:49.578389 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-f583f6ff46c2cfdceba719177adb96efc5a13a0650447bce248e29cbca2d5368 WatchSource:0}: Error finding container f583f6ff46c2cfdceba719177adb96efc5a13a0650447bce248e29cbca2d5368: Status 404 returned error can't find the container with id f583f6ff46c2cfdceba719177adb96efc5a13a0650447bce248e29cbca2d5368 Mar 20 17:19:49 crc kubenswrapper[4803]: E0320 17:19:49.582389 4803 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c5468e71fd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:19:49.581717463 +0000 UTC m=+199.493309563,LastTimestamp:2026-03-20 17:19:49.581717463 +0000 UTC m=+199.493309563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.619675 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f583f6ff46c2cfdceba719177adb96efc5a13a0650447bce248e29cbca2d5368"} Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.623058 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d349bec2-b61d-40f3-9331-580aac5a4d4d","Type":"ContainerDied","Data":"7bd2383ce333476283702b71154dfdab9f10b6c61dd0f2426307d18eed4e857b"} Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.623075 4803 generic.go:334] "Generic (PLEG): container finished" podID="d349bec2-b61d-40f3-9331-580aac5a4d4d" containerID="7bd2383ce333476283702b71154dfdab9f10b6c61dd0f2426307d18eed4e857b" exitCode=0 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.624216 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.625384 4803 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.628949 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" 
Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.631060 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.633212 4803 scope.go:117] "RemoveContainer" containerID="02d150e248bcae57045b73be5976cd4e7f89d2bab797c3a4a31ab3344a9deb5d" Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.633626 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a" exitCode=0 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.633697 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc" exitCode=0 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.633708 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77" exitCode=0 Mar 20 17:19:49 crc kubenswrapper[4803]: I0320 17:19:49.633719 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a" exitCode=2 Mar 20 17:19:50 crc kubenswrapper[4803]: I0320 17:19:50.645879 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:19:50 crc kubenswrapper[4803]: I0320 17:19:50.651178 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb"} Mar 
20 17:19:50 crc kubenswrapper[4803]: I0320 17:19:50.652377 4803 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:50 crc kubenswrapper[4803]: I0320 17:19:50.653115 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:50 crc kubenswrapper[4803]: E0320 17:19:50.653401 4803 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:50 crc kubenswrapper[4803]: I0320 17:19:50.850483 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:50 crc kubenswrapper[4803]: I0320 17:19:50.851117 4803 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:50 crc kubenswrapper[4803]: I0320 17:19:50.857681 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7e906a75-4a60-419b-9248-89a5e14229f0" path="/var/lib/kubelet/pods/7e906a75-4a60-419b-9248-89a5e14229f0/volumes" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.190961 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.191855 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.339064 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d349bec2-b61d-40f3-9331-580aac5a4d4d-kube-api-access\") pod \"d349bec2-b61d-40f3-9331-580aac5a4d4d\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.339152 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-kubelet-dir\") pod \"d349bec2-b61d-40f3-9331-580aac5a4d4d\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.339237 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-var-lock\") pod \"d349bec2-b61d-40f3-9331-580aac5a4d4d\" (UID: \"d349bec2-b61d-40f3-9331-580aac5a4d4d\") " Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.339326 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") 
pod "d349bec2-b61d-40f3-9331-580aac5a4d4d" (UID: "d349bec2-b61d-40f3-9331-580aac5a4d4d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.339381 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-var-lock" (OuterVolumeSpecName: "var-lock") pod "d349bec2-b61d-40f3-9331-580aac5a4d4d" (UID: "d349bec2-b61d-40f3-9331-580aac5a4d4d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.339835 4803 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.339871 4803 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d349bec2-b61d-40f3-9331-580aac5a4d4d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.356019 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d349bec2-b61d-40f3-9331-580aac5a4d4d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d349bec2-b61d-40f3-9331-580aac5a4d4d" (UID: "d349bec2-b61d-40f3-9331-580aac5a4d4d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.440897 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d349bec2-b61d-40f3-9331-580aac5a4d4d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.570746 4803 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c5468e71fd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:19:49.581717463 +0000 UTC m=+199.493309563,LastTimestamp:2026-03-20 17:19:49.581717463 +0000 UTC m=+199.493309563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.592036 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.594080 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.594985 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.595568 4803 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.663301 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d349bec2-b61d-40f3-9331-580aac5a4d4d","Type":"ContainerDied","Data":"43d0e7ca4d5414126bf1a8fa313030f40772857b7899656cab378d9eb4ab6ed0"} Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.663365 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.663374 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d0e7ca4d5414126bf1a8fa313030f40772857b7899656cab378d9eb4ab6ed0" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.668036 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.669381 4803 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1" exitCode=0 Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.669480 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.669618 4803 scope.go:117] "RemoveContainer" containerID="3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.670271 4803 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.689434 4803 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.690051 4803 status_manager.go:851] "Failed to get status for pod" 
podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.697131 4803 scope.go:117] "RemoveContainer" containerID="faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.717889 4803 scope.go:117] "RemoveContainer" containerID="eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.738851 4803 scope.go:117] "RemoveContainer" containerID="4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.749342 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.749435 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.749499 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.749630 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.749643 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.749701 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.750275 4803 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.750314 4803 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.750333 4803 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.760795 4803 scope.go:117] "RemoveContainer" containerID="cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.782397 4803 scope.go:117] "RemoveContainer" containerID="51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.801303 4803 scope.go:117] "RemoveContainer" containerID="3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.801709 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\": container with ID starting with 3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a not found: ID does not exist" containerID="3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.801770 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a"} err="failed to get container status \"3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\": rpc error: code = NotFound desc = could not find container \"3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a\": container with ID starting with 3d15888e5b043cc67fcbda9a06c0900c999614ca44c969dd43aa22422de5383a not found: ID does not exist" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.801823 4803 scope.go:117] "RemoveContainer" containerID="faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.802385 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\": container with ID starting with faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc not found: ID does not exist" containerID="faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.802463 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc"} err="failed to get container status \"faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\": rpc error: code = NotFound desc = could not find container \"faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc\": container with ID starting with faa550874db46ef8dee9b6d3d9c5f58d248e3c463b6e5082e00a54ecdbb65bdc not found: ID does not exist" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.802511 4803 scope.go:117] "RemoveContainer" containerID="eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.802934 4803 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\": container with ID starting with eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77 not found: ID does not exist" containerID="eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.802979 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77"} err="failed to get container status \"eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\": rpc error: code = NotFound desc = could not find container \"eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77\": container with ID starting with eeab52f686476aa7d7af960ef03c81c815cf72d4f3dc41b1eeb2021535a19f77 not found: ID does not exist" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.803006 4803 scope.go:117] "RemoveContainer" containerID="4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.803375 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\": container with ID starting with 4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a not found: ID does not exist" containerID="4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.803403 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a"} err="failed to get container status \"4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\": rpc error: code = NotFound desc = could not find container 
\"4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a\": container with ID starting with 4446066db6db2fb0d8af74022893991be0cdf1a9f6a71d8f45273098cfc3443a not found: ID does not exist" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.803424 4803 scope.go:117] "RemoveContainer" containerID="cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.803874 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\": container with ID starting with cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1 not found: ID does not exist" containerID="cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.803930 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1"} err="failed to get container status \"cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\": rpc error: code = NotFound desc = could not find container \"cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1\": container with ID starting with cb0a8a1a1758bc76b9932adccc478863421448cc166f5c3efdbc28b1ab19c5c1 not found: ID does not exist" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.803968 4803 scope.go:117] "RemoveContainer" containerID="51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908" Mar 20 17:19:51 crc kubenswrapper[4803]: E0320 17:19:51.804497 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\": container with ID starting with 51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908 not found: ID does not exist" 
containerID="51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.804517 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908"} err="failed to get container status \"51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\": rpc error: code = NotFound desc = could not find container \"51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908\": container with ID starting with 51c946f89e072e7736f8058009821f7da4602cf89f35d4b8349d729877149908 not found: ID does not exist" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.995356 4803 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:51 crc kubenswrapper[4803]: I0320 17:19:51.995982 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:52 crc kubenswrapper[4803]: I0320 17:19:52.868562 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 17:19:52 crc kubenswrapper[4803]: E0320 17:19:52.893694 4803 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:52 crc 
kubenswrapper[4803]: E0320 17:19:52.894515 4803 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:52 crc kubenswrapper[4803]: E0320 17:19:52.894845 4803 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:52 crc kubenswrapper[4803]: E0320 17:19:52.895227 4803 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:52 crc kubenswrapper[4803]: E0320 17:19:52.895565 4803 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:52 crc kubenswrapper[4803]: I0320 17:19:52.895590 4803 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 17:19:52 crc kubenswrapper[4803]: E0320 17:19:52.895906 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.097198 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" 
interval="400ms" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.498765 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.914100 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:53Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:53Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:53Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T17:19:53Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.914849 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.915400 4803 kubelet_node_status.go:585] 
"Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.915940 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.916472 4803 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:19:53 crc kubenswrapper[4803]: E0320 17:19:53.916512 4803 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 17:19:54 crc kubenswrapper[4803]: E0320 17:19:54.300206 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Mar 20 17:19:55 crc kubenswrapper[4803]: E0320 17:19:55.901897 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Mar 20 17:19:59 crc kubenswrapper[4803]: E0320 17:19:59.103667 4803 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="6.4s" Mar 20 17:20:00 crc kubenswrapper[4803]: 
I0320 17:20:00.852496 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:20:00 crc kubenswrapper[4803]: E0320 17:20:00.943997 4803 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" volumeName="registry-storage" Mar 20 17:20:01 crc kubenswrapper[4803]: E0320 17:20:01.571560 4803 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e9c5468e71fd7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 17:19:49.581717463 +0000 UTC m=+199.493309563,LastTimestamp:2026-03-20 17:19:49.581717463 +0000 UTC m=+199.493309563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 17:20:01 crc kubenswrapper[4803]: I0320 17:20:01.847542 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:01 crc kubenswrapper[4803]: I0320 17:20:01.848787 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:20:01 crc kubenswrapper[4803]: I0320 17:20:01.859737 4803 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:01 crc kubenswrapper[4803]: I0320 17:20:01.859772 4803 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:01 crc kubenswrapper[4803]: E0320 17:20:01.860254 4803 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:01 crc kubenswrapper[4803]: I0320 17:20:01.861058 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.828229 4803 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="d3c5e84f405ca8566bc005781d44613df4adf65cf652bccad10aedc9d213a993" exitCode=0 Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.828351 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"d3c5e84f405ca8566bc005781d44613df4adf65cf652bccad10aedc9d213a993"} Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.828884 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc11110c4416e34ddcaaf077abc6336fb3a3aef59491a19eecc136925c3778de"} Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.829174 4803 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.829189 4803 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:02 crc kubenswrapper[4803]: E0320 17:20:02.829920 4803 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.829925 4803 status_manager.go:851] "Failed to get status for pod" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.833126 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.833973 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.834043 4803 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="93126eed679cf9381d281ba5f85387fecfe1fa5e889e7302ba7b6378fe1c074a" exitCode=1 Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.834087 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"93126eed679cf9381d281ba5f85387fecfe1fa5e889e7302ba7b6378fe1c074a"} Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.834669 4803 scope.go:117] "RemoveContainer" containerID="93126eed679cf9381d281ba5f85387fecfe1fa5e889e7302ba7b6378fe1c074a" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.835389 4803 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:20:02 crc kubenswrapper[4803]: I0320 17:20:02.836234 4803 status_manager.go:851] "Failed to get status for pod" 
podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 17:20:03 crc kubenswrapper[4803]: I0320 17:20:03.850646 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:20:03 crc kubenswrapper[4803]: I0320 17:20:03.852466 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:20:03 crc kubenswrapper[4803]: I0320 17:20:03.852591 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"05b3ae066c0990823374b67ded7e4582bd6e27657b0fd707525ceda3c223c3fa"} Mar 20 17:20:03 crc kubenswrapper[4803]: I0320 17:20:03.855343 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"716d45c2feb0975a66474d9f3615ee592b76ad63070738c2ae4cc7e10a248275"} Mar 20 17:20:03 crc kubenswrapper[4803]: I0320 17:20:03.855383 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8ec9843959b704d8eddc39351390ac105c07a5a5f0a90cc0e5d39be72e4abd82"} Mar 20 17:20:03 crc kubenswrapper[4803]: I0320 17:20:03.855398 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a6283947c8c8eaed96229653ced263dc90a119db9163e9fd24e87d2f83686543"} Mar 20 17:20:04 crc kubenswrapper[4803]: I0320 17:20:04.861786 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"81d7aff46e19e18fc86e11fb45130f51b36d98d96b313312e15baf2065e57169"} Mar 20 17:20:04 crc kubenswrapper[4803]: I0320 17:20:04.862020 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dfe5970331eb248977c64dcb20b96cde218ca08400df0a297e9a9cdc89636dcf"} Mar 20 17:20:04 crc kubenswrapper[4803]: I0320 17:20:04.862212 4803 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:04 crc kubenswrapper[4803]: I0320 17:20:04.862226 4803 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:04 crc kubenswrapper[4803]: I0320 17:20:04.862405 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:06 crc kubenswrapper[4803]: I0320 17:20:06.862578 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:06 crc kubenswrapper[4803]: I0320 17:20:06.864021 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:06 crc kubenswrapper[4803]: I0320 17:20:06.870328 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:08 crc kubenswrapper[4803]: I0320 17:20:08.246220 
4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:20:08 crc kubenswrapper[4803]: I0320 17:20:08.246310 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:20:08 crc kubenswrapper[4803]: I0320 17:20:08.753615 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" podUID="baff85ab-57bf-49c5-8009-938ad47246aa" containerName="oauth-openshift" containerID="cri-o://adc48f9207c753a1ebbfcae066f5e56440c200be25054b311a061c49711ac3a1" gracePeriod=15 Mar 20 17:20:08 crc kubenswrapper[4803]: I0320 17:20:08.907171 4803 generic.go:334] "Generic (PLEG): container finished" podID="baff85ab-57bf-49c5-8009-938ad47246aa" containerID="adc48f9207c753a1ebbfcae066f5e56440c200be25054b311a061c49711ac3a1" exitCode=0 Mar 20 17:20:08 crc kubenswrapper[4803]: I0320 17:20:08.907244 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" event={"ID":"baff85ab-57bf-49c5-8009-938ad47246aa","Type":"ContainerDied","Data":"adc48f9207c753a1ebbfcae066f5e56440c200be25054b311a061c49711ac3a1"} Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.272213 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.428839 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-ocp-branding-template\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.428905 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-login\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.428967 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-provider-selection\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429005 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-router-certs\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429059 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-service-ca\") pod 
\"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429096 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-idp-0-file-data\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429132 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/baff85ab-57bf-49c5-8009-938ad47246aa-audit-dir\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429169 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-trusted-ca-bundle\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429203 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-audit-policies\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429258 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-serving-cert\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: 
I0320 17:20:09.429296 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-session\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429275 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baff85ab-57bf-49c5-8009-938ad47246aa-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429359 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgql4\" (UniqueName: \"kubernetes.io/projected/baff85ab-57bf-49c5-8009-938ad47246aa-kube-api-access-tgql4\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429396 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-error\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429431 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-cliconfig\") pod \"baff85ab-57bf-49c5-8009-938ad47246aa\" (UID: \"baff85ab-57bf-49c5-8009-938ad47246aa\") " Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.429730 4803 reconciler_common.go:293] "Volume 
detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/baff85ab-57bf-49c5-8009-938ad47246aa-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.430142 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.430386 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.430442 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.431265 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). 
InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.435016 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.436847 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.436966 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baff85ab-57bf-49c5-8009-938ad47246aa-kube-api-access-tgql4" (OuterVolumeSpecName: "kube-api-access-tgql4") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "kube-api-access-tgql4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.437064 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.437367 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.441797 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.445956 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.446325 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.446780 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "baff85ab-57bf-49c5-8009-938ad47246aa" (UID: "baff85ab-57bf-49c5-8009-938ad47246aa"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534042 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534085 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534098 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534112 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534124 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534136 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534150 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534161 4803 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534175 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534187 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534199 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgql4\" (UniqueName: \"kubernetes.io/projected/baff85ab-57bf-49c5-8009-938ad47246aa-kube-api-access-tgql4\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534210 4803 reconciler_common.go:293] "Volume detached for 
volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.534225 4803 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/baff85ab-57bf-49c5-8009-938ad47246aa-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.871511 4803 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.916919 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" event={"ID":"baff85ab-57bf-49c5-8009-938ad47246aa","Type":"ContainerDied","Data":"2891369a309be11942ff09790fd616ec63f368564408f0939d12bbc223c395a8"} Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.916986 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tsv6s" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.917008 4803 scope.go:117] "RemoveContainer" containerID="adc48f9207c753a1ebbfcae066f5e56440c200be25054b311a061c49711ac3a1" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.917763 4803 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.917872 4803 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:09 crc kubenswrapper[4803]: I0320 17:20:09.922640 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 17:20:10 crc kubenswrapper[4803]: I0320 17:20:10.625435 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:20:10 crc kubenswrapper[4803]: I0320 17:20:10.864129 4803 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="64a1f214-345b-413f-922e-6fc3004ecc22" Mar 20 17:20:10 crc kubenswrapper[4803]: I0320 17:20:10.921670 4803 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:10 crc kubenswrapper[4803]: I0320 17:20:10.921704 4803 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="94a7cc13-e3ab-4d94-91e5-fb2d1607f3ba" Mar 20 17:20:10 crc kubenswrapper[4803]: I0320 17:20:10.925356 4803 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="64a1f214-345b-413f-922e-6fc3004ecc22" Mar 20 17:20:12 crc kubenswrapper[4803]: I0320 17:20:12.099943 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:20:12 crc kubenswrapper[4803]: I0320 17:20:12.100045 4803 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:20:12 crc kubenswrapper[4803]: I0320 17:20:12.100482 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 17:20:19 crc kubenswrapper[4803]: I0320 17:20:19.292138 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 17:20:20 crc kubenswrapper[4803]: I0320 17:20:20.277777 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:20:20 crc kubenswrapper[4803]: I0320 17:20:20.622005 4803 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 17:20:20 crc kubenswrapper[4803]: I0320 17:20:20.664251 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 17:20:20 crc kubenswrapper[4803]: I0320 17:20:20.698915 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" 
Mar 20 17:20:20 crc kubenswrapper[4803]: I0320 17:20:20.821811 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 17:20:21 crc kubenswrapper[4803]: I0320 17:20:21.477984 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 17:20:21 crc kubenswrapper[4803]: I0320 17:20:21.492143 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 17:20:21 crc kubenswrapper[4803]: I0320 17:20:21.793140 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 17:20:21 crc kubenswrapper[4803]: I0320 17:20:21.803086 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 17:20:21 crc kubenswrapper[4803]: I0320 17:20:21.816403 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 17:20:21 crc kubenswrapper[4803]: I0320 17:20:21.943845 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 17:20:21 crc kubenswrapper[4803]: I0320 17:20:21.998131 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.100384 4803 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.100461 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.295730 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.560990 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.567192 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.601335 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.849103 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 17:20:22 crc kubenswrapper[4803]: I0320 17:20:22.952200 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.025242 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.056385 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.073558 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.145254 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.167196 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.272382 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.316865 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.491451 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.544846 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 17:20:23 crc kubenswrapper[4803]: I0320 17:20:23.807148 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.019729 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.051893 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.130863 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.202142 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.380691 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.382666 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.720861 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.848600 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.865642 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.927149 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 17:20:24 crc kubenswrapper[4803]: I0320 17:20:24.966961 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.188002 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.219107 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.371195 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.394308 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.428730 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.450199 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.481210 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.560583 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.652678 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.704676 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.750731 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.813402 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.900012 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 17:20:25 crc kubenswrapper[4803]: I0320 17:20:25.938193 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.010285 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.181029 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.297302 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.322168 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.340211 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.394911 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.511243 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.535333 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.536976 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.538493 4803 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.543638 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.546998 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-tsv6s"]
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.547117 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.560434 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.577299 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=17.57727741 podStartE2EDuration="17.57727741s" podCreationTimestamp="2026-03-20 17:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:26.574202214 +0000 UTC m=+236.485794304" watchObservedRunningTime="2026-03-20 17:20:26.57727741 +0000 UTC m=+236.488869510"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.605083 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.615977 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.628032 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.669227 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.691718 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.712946 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.751723 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.756400 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.860254 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baff85ab-57bf-49c5-8009-938ad47246aa" path="/var/lib/kubelet/pods/baff85ab-57bf-49c5-8009-938ad47246aa/volumes"
Mar 20 17:20:26 crc kubenswrapper[4803]: I0320 17:20:26.956476 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.027145 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.138965 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.324511 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.330306 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.366117 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.463020 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.637682 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.643132 4803 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.668643 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.772941 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"]
Mar 20 17:20:27 crc kubenswrapper[4803]: E0320 17:20:27.773171 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" containerName="installer"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.773186 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" containerName="installer"
Mar 20 17:20:27 crc kubenswrapper[4803]: E0320 17:20:27.773204 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baff85ab-57bf-49c5-8009-938ad47246aa" containerName="oauth-openshift"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.773212 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="baff85ab-57bf-49c5-8009-938ad47246aa" containerName="oauth-openshift"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.773335 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="baff85ab-57bf-49c5-8009-938ad47246aa" containerName="oauth-openshift"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.773349 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d349bec2-b61d-40f3-9331-580aac5a4d4d" containerName="installer"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.773806 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.776198 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.776663 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.777666 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.778150 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.778570 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.778850 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.779041 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.779363 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.779386 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.779571 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.779633 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.782844 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.794993 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.798423 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.800396 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.804871 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.847661 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.862834 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912584 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912676 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912740 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912767 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912815 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912887 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-audit-dir\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912916 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-session\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.912939 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-login\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.913580 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wbl2\" (UniqueName: \"kubernetes.io/projected/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-kube-api-access-5wbl2\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.913646 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.913679 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.913729 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-error\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.914022 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.914052 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-audit-policies\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.931050 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.953847 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 17:20:27 crc kubenswrapper[4803]: I0320 17:20:27.960278 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015074 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-error\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015165 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015209 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-audit-policies\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015283 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015325 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015365 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015398 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015429 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015483 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-audit-dir\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015548 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-session\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015604 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-login\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015659 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wbl2\" (UniqueName: \"kubernetes.io/projected/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-kube-api-access-5wbl2\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.017042 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.017107 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.015796 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-audit-dir\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.018075 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.018934 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.019609 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-audit-policies\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.020004 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.024841 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-error\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.025186 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.025306 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.025960 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.026300 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-system-session\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.029128 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"
Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.029654 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.030227 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.031869 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-v4-0-config-user-template-login\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.044263 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wbl2\" (UniqueName: \"kubernetes.io/projected/16c7c6f0-a14b-4f75-b0a7-2ccd300881a5-kube-api-access-5wbl2\") pod \"oauth-openshift-57ccb4dddc-96vgw\" (UID: \"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5\") " pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.049188 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.098297 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.226078 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.387490 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.509962 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.520519 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.522220 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.535975 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.543748 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.549941 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.567858 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.614435 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 17:20:28 
crc kubenswrapper[4803]: I0320 17:20:28.636108 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.709199 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.750952 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.764994 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.903767 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.908307 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.965839 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.970387 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.977404 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 17:20:28 crc kubenswrapper[4803]: I0320 17:20:28.979785 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.048648 4803 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.051478 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.085202 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.160353 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.173999 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.213351 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.275784 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.355308 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.377817 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.379683 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.425004 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 17:20:29 crc 
kubenswrapper[4803]: I0320 17:20:29.455184 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.651086 4803 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.658000 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.699160 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.747697 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.868859 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.904381 4803 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.958024 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 17:20:29 crc kubenswrapper[4803]: I0320 17:20:29.960797 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.044346 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.046999 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 17:20:30 crc 
kubenswrapper[4803]: I0320 17:20:30.052187 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.063297 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.143124 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.202113 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.281671 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.283678 4803 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.309271 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.362749 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.378485 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.438281 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.526412 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 
17:20:30.609605 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.629683 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.691902 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.711341 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"] Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.741177 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.753252 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.800284 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.858567 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.901188 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.982890 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 17:20:30 crc kubenswrapper[4803]: I0320 17:20:30.994899 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 17:20:31 crc 
kubenswrapper[4803]: I0320 17:20:31.007994 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.013245 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.066680 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.100316 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.149359 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.196783 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.200003 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.201425 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.227359 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.259047 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-57ccb4dddc-96vgw"] Mar 20 17:20:31 crc kubenswrapper[4803]: W0320 17:20:31.263745 4803 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16c7c6f0_a14b_4f75_b0a7_2ccd300881a5.slice/crio-b77bafa444f7e12d43a5e7258081b75784f5a9f6a83f2dc9d9596b41493b7136 WatchSource:0}: Error finding container b77bafa444f7e12d43a5e7258081b75784f5a9f6a83f2dc9d9596b41493b7136: Status 404 returned error can't find the container with id b77bafa444f7e12d43a5e7258081b75784f5a9f6a83f2dc9d9596b41493b7136 Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.294042 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.320672 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.383318 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.388823 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.439922 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.479031 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.621219 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.627284 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 
17:20:31.848915 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.877950 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.889659 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.925087 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 17:20:31 crc kubenswrapper[4803]: I0320 17:20:31.965049 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.026483 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.027160 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.067832 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.074327 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" event={"ID":"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5","Type":"ContainerStarted","Data":"0f120107fba6af95de16c70c8049e3e2b0ac5c76bcea9198082a88f008e48958"} Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.074398 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" 
event={"ID":"16c7c6f0-a14b-4f75-b0a7-2ccd300881a5","Type":"ContainerStarted","Data":"b77bafa444f7e12d43a5e7258081b75784f5a9f6a83f2dc9d9596b41493b7136"} Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.074656 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.092025 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.100014 4803 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.100082 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.100146 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.100970 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"05b3ae066c0990823374b67ded7e4582bd6e27657b0fd707525ceda3c223c3fa"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 
17:20:32.101160 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://05b3ae066c0990823374b67ded7e4582bd6e27657b0fd707525ceda3c223c3fa" gracePeriod=30 Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.106631 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" podStartSLOduration=49.106608112 podStartE2EDuration="49.106608112s" podCreationTimestamp="2026-03-20 17:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:20:32.099171168 +0000 UTC m=+242.010763298" watchObservedRunningTime="2026-03-20 17:20:32.106608112 +0000 UTC m=+242.018200192" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.108944 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.116939 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.131493 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.170752 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.261431 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.283307 4803 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.314167 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-57ccb4dddc-96vgw" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.408225 4803 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.408428 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb" gracePeriod=5 Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.491077 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.603232 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.603347 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.695304 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.718254 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.739902 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 17:20:32 crc 
kubenswrapper[4803]: I0320 17:20:32.749676 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 17:20:32 crc kubenswrapper[4803]: I0320 17:20:32.800375 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.047777 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.137340 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.146858 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.185695 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.202237 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.223547 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.295255 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.307563 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.336607 4803 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.460041 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.526433 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.545832 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.572871 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.743251 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:20:33 crc kubenswrapper[4803]: I0320 17:20:33.813293 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.131439 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.173794 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.176850 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.184796 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 17:20:34 crc kubenswrapper[4803]: 
I0320 17:20:34.278712 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.404791 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.416761 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.447568 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.531960 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.559943 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 17:20:34 crc kubenswrapper[4803]: I0320 17:20:34.958977 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.072603 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.151201 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.429100 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.497199 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 17:20:35 
crc kubenswrapper[4803]: I0320 17:20:35.667689 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.690433 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.775903 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.788785 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 17:20:35 crc kubenswrapper[4803]: I0320 17:20:35.843331 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 17:20:36 crc kubenswrapper[4803]: I0320 17:20:36.263620 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 17:20:36 crc kubenswrapper[4803]: I0320 17:20:36.321459 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 17:20:36 crc kubenswrapper[4803]: I0320 17:20:36.376983 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 17:20:36 crc kubenswrapper[4803]: I0320 17:20:36.580922 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 17:20:36 crc kubenswrapper[4803]: I0320 17:20:36.599484 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 17:20:36 crc kubenswrapper[4803]: I0320 17:20:36.753846 4803 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 17:20:36 crc kubenswrapper[4803]: I0320 17:20:36.938729 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 17:20:37 crc kubenswrapper[4803]: I0320 17:20:37.274563 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.010986 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.011350 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.067596 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.067691 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.067744 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:20:38 crc kubenswrapper[4803]: 
I0320 17:20:38.067772 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.067787 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.067827 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.067840 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.067929 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.068020 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.068207 4803 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.068227 4803 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.068245 4803 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.068261 4803 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.078554 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.129389 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.129449 4803 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb" exitCode=137 Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.129497 4803 scope.go:117] "RemoveContainer" containerID="cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.130109 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.152376 4803 scope.go:117] "RemoveContainer" containerID="cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb" Mar 20 17:20:38 crc kubenswrapper[4803]: E0320 17:20:38.152762 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb\": container with ID starting with cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb not found: ID does not exist" containerID="cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.152805 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb"} err="failed to get container status \"cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb\": rpc error: code = NotFound desc = could not find container \"cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb\": container with ID starting with cffda65c1aa6d21c8cba2d02ab6298b552233ddd443928b781b29096169ca6eb not found: ID does not exist" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.169075 4803 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.246662 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 
17:20:38.246797 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:20:38 crc kubenswrapper[4803]: I0320 17:20:38.862306 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 17:21:02 crc kubenswrapper[4803]: I0320 17:21:02.293273 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 17:21:02 crc kubenswrapper[4803]: I0320 17:21:02.295945 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:21:02 crc kubenswrapper[4803]: I0320 17:21:02.296791 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 17:21:02 crc kubenswrapper[4803]: I0320 17:21:02.296845 4803 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="05b3ae066c0990823374b67ded7e4582bd6e27657b0fd707525ceda3c223c3fa" exitCode=137 Mar 20 17:21:02 crc kubenswrapper[4803]: I0320 17:21:02.296883 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"05b3ae066c0990823374b67ded7e4582bd6e27657b0fd707525ceda3c223c3fa"} Mar 20 17:21:02 crc kubenswrapper[4803]: I0320 17:21:02.296917 
4803 scope.go:117] "RemoveContainer" containerID="93126eed679cf9381d281ba5f85387fecfe1fa5e889e7302ba7b6378fe1c074a" Mar 20 17:21:03 crc kubenswrapper[4803]: I0320 17:21:03.305892 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 17:21:03 crc kubenswrapper[4803]: I0320 17:21:03.307407 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 17:21:03 crc kubenswrapper[4803]: I0320 17:21:03.307477 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"807a3adf7c6de5024e4ec458de24d6b777d247003ad24c94ede2650b1d1b495d"} Mar 20 17:21:08 crc kubenswrapper[4803]: I0320 17:21:08.245941 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:21:08 crc kubenswrapper[4803]: I0320 17:21:08.246857 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:21:08 crc kubenswrapper[4803]: I0320 17:21:08.246945 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:21:08 crc kubenswrapper[4803]: I0320 17:21:08.247987 4803 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:21:08 crc kubenswrapper[4803]: I0320 17:21:08.248118 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203" gracePeriod=600 Mar 20 17:21:09 crc kubenswrapper[4803]: I0320 17:21:09.355423 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203" exitCode=0 Mar 20 17:21:09 crc kubenswrapper[4803]: I0320 17:21:09.355579 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203"} Mar 20 17:21:09 crc kubenswrapper[4803]: I0320 17:21:09.356173 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"45b15f7f930be75ad9338760e525031847e940aad8524904031331b60994207d"} Mar 20 17:21:10 crc kubenswrapper[4803]: I0320 17:21:10.626167 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:21:12 crc kubenswrapper[4803]: I0320 17:21:12.099996 4803 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:21:12 crc kubenswrapper[4803]: I0320 17:21:12.106232 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:21:12 crc kubenswrapper[4803]: I0320 17:21:12.383310 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 17:21:20 crc kubenswrapper[4803]: I0320 17:21:20.862579 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql"] Mar 20 17:21:20 crc kubenswrapper[4803]: I0320 17:21:20.865715 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" podUID="ab886873-9ec5-48f3-aea1-dc39ab313aee" containerName="route-controller-manager" containerID="cri-o://96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b" gracePeriod=30 Mar 20 17:21:20 crc kubenswrapper[4803]: I0320 17:21:20.890876 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b8c575497-lsj7r"] Mar 20 17:21:20 crc kubenswrapper[4803]: I0320 17:21:20.891119 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" podUID="d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" containerName="controller-manager" containerID="cri-o://ad0198c8aeda7a43682c4987775dc644f0b6a3d9d51417b913aa496c2ab7abfa" gracePeriod=30 Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.034712 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567120-kk4jn"] Mar 20 17:21:21 crc kubenswrapper[4803]: E0320 17:21:21.035410 4803 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.035492 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.035663 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.036045 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-kk4jn" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.038106 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.038351 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.038491 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.041285 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-kk4jn"] Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.114998 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkhk\" (UniqueName: \"kubernetes.io/projected/ab12c524-2ba2-4454-8079-af3be98f4ccf-kube-api-access-ztkhk\") pod \"auto-csr-approver-29567120-kk4jn\" (UID: \"ab12c524-2ba2-4454-8079-af3be98f4ccf\") " pod="openshift-infra/auto-csr-approver-29567120-kk4jn" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.218636 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkhk\" (UniqueName: 
\"kubernetes.io/projected/ab12c524-2ba2-4454-8079-af3be98f4ccf-kube-api-access-ztkhk\") pod \"auto-csr-approver-29567120-kk4jn\" (UID: \"ab12c524-2ba2-4454-8079-af3be98f4ccf\") " pod="openshift-infra/auto-csr-approver-29567120-kk4jn" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.252603 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkhk\" (UniqueName: \"kubernetes.io/projected/ab12c524-2ba2-4454-8079-af3be98f4ccf-kube-api-access-ztkhk\") pod \"auto-csr-approver-29567120-kk4jn\" (UID: \"ab12c524-2ba2-4454-8079-af3be98f4ccf\") " pod="openshift-infra/auto-csr-approver-29567120-kk4jn" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.321236 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.416463 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-kk4jn" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.420190 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtfqp\" (UniqueName: \"kubernetes.io/projected/ab886873-9ec5-48f3-aea1-dc39ab313aee-kube-api-access-jtfqp\") pod \"ab886873-9ec5-48f3-aea1-dc39ab313aee\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.420253 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab886873-9ec5-48f3-aea1-dc39ab313aee-serving-cert\") pod \"ab886873-9ec5-48f3-aea1-dc39ab313aee\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.420309 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-config\") pod \"ab886873-9ec5-48f3-aea1-dc39ab313aee\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.420326 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-client-ca\") pod \"ab886873-9ec5-48f3-aea1-dc39ab313aee\" (UID: \"ab886873-9ec5-48f3-aea1-dc39ab313aee\") " Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.421074 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab886873-9ec5-48f3-aea1-dc39ab313aee" (UID: "ab886873-9ec5-48f3-aea1-dc39ab313aee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.421298 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-config" (OuterVolumeSpecName: "config") pod "ab886873-9ec5-48f3-aea1-dc39ab313aee" (UID: "ab886873-9ec5-48f3-aea1-dc39ab313aee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.423459 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab886873-9ec5-48f3-aea1-dc39ab313aee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab886873-9ec5-48f3-aea1-dc39ab313aee" (UID: "ab886873-9ec5-48f3-aea1-dc39ab313aee"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.426768 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab886873-9ec5-48f3-aea1-dc39ab313aee-kube-api-access-jtfqp" (OuterVolumeSpecName: "kube-api-access-jtfqp") pod "ab886873-9ec5-48f3-aea1-dc39ab313aee" (UID: "ab886873-9ec5-48f3-aea1-dc39ab313aee"). InnerVolumeSpecName "kube-api-access-jtfqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.441354 4803 generic.go:334] "Generic (PLEG): container finished" podID="d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" containerID="ad0198c8aeda7a43682c4987775dc644f0b6a3d9d51417b913aa496c2ab7abfa" exitCode=0 Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.441464 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" event={"ID":"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862","Type":"ContainerDied","Data":"ad0198c8aeda7a43682c4987775dc644f0b6a3d9d51417b913aa496c2ab7abfa"} Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.443113 4803 generic.go:334] "Generic (PLEG): container finished" podID="ab886873-9ec5-48f3-aea1-dc39ab313aee" containerID="96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b" exitCode=0 Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.443156 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" event={"ID":"ab886873-9ec5-48f3-aea1-dc39ab313aee","Type":"ContainerDied","Data":"96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b"} Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.443180 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" 
event={"ID":"ab886873-9ec5-48f3-aea1-dc39ab313aee","Type":"ContainerDied","Data":"75b0e52c78ab620643d7743b0d34cb908b9cdf2ec9a9a8d30592656d64713ae6"} Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.443197 4803 scope.go:117] "RemoveContainer" containerID="96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.443230 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.456484 4803 scope.go:117] "RemoveContainer" containerID="96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b" Mar 20 17:21:21 crc kubenswrapper[4803]: E0320 17:21:21.456979 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b\": container with ID starting with 96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b not found: ID does not exist" containerID="96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.457032 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b"} err="failed to get container status \"96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b\": rpc error: code = NotFound desc = could not find container \"96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b\": container with ID starting with 96e67ca216f2d1c1b1ce156e9679eb334edb3bcbcb43cb4331bb4be4623b180b not found: ID does not exist" Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.471374 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql"] Mar 
20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.474098 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f956fc54d-t9vql"]
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.509460 4803 patch_prober.go:28] interesting pod/controller-manager-5b8c575497-lsj7r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused" start-of-body=
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.509545 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" podUID="d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": dial tcp 10.217.0.62:8443: connect: connection refused"
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.522307 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab886873-9ec5-48f3-aea1-dc39ab313aee-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.522354 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.522364 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab886873-9ec5-48f3-aea1-dc39ab313aee-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.522374 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtfqp\" (UniqueName: \"kubernetes.io/projected/ab886873-9ec5-48f3-aea1-dc39ab313aee-kube-api-access-jtfqp\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.778023 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r"
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.826493 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-config\") pod \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") "
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.826708 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-proxy-ca-bundles\") pod \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") "
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.826826 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-serving-cert\") pod \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") "
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.826889 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-client-ca\") pod \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") "
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.826954 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9ckj\" (UniqueName: \"kubernetes.io/projected/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-kube-api-access-f9ckj\") pod \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\" (UID: \"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862\") "
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.827272 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-config" (OuterVolumeSpecName: "config") pod "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" (UID: "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.827449 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" (UID: "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.828137 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-client-ca" (OuterVolumeSpecName: "client-ca") pod "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" (UID: "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.835842 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-kube-api-access-f9ckj" (OuterVolumeSpecName: "kube-api-access-f9ckj") pod "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" (UID: "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862"). InnerVolumeSpecName "kube-api-access-f9ckj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.835994 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" (UID: "d3ec1cd8-503c-493e-a3a5-6ff6c17b0862"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.844074 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-kk4jn"]
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.928796 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.928835 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.928850 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.928864 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:21 crc kubenswrapper[4803]: I0320 17:21:21.928877 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9ckj\" (UniqueName: \"kubernetes.io/projected/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862-kube-api-access-f9ckj\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.235308 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"]
Mar 20 17:21:22 crc kubenswrapper[4803]: E0320 17:21:22.235767 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" containerName="controller-manager"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.235779 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" containerName="controller-manager"
Mar 20 17:21:22 crc kubenswrapper[4803]: E0320 17:21:22.235791 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab886873-9ec5-48f3-aea1-dc39ab313aee" containerName="route-controller-manager"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.235798 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab886873-9ec5-48f3-aea1-dc39ab313aee" containerName="route-controller-manager"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.235881 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab886873-9ec5-48f3-aea1-dc39ab313aee" containerName="route-controller-manager"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.235895 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" containerName="controller-manager"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.236229 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.244179 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"]
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.245716 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.248409 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.248804 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.248896 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.248910 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.249507 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.251979 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.263444 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"]
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.313501 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"]
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333336 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q2nj\" (UniqueName: \"kubernetes.io/projected/233361dc-6f20-402b-858c-41285e4e392e-kube-api-access-8q2nj\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333406 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/233361dc-6f20-402b-858c-41285e4e392e-serving-cert\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333552 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkvz\" (UniqueName: \"kubernetes.io/projected/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-kube-api-access-cgkvz\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333596 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-config\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333638 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-client-ca\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333669 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-serving-cert\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333720 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-proxy-ca-bundles\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333755 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-config\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.333800 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-client-ca\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.434436 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkvz\" (UniqueName: \"kubernetes.io/projected/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-kube-api-access-cgkvz\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.434513 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-config\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.434582 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-client-ca\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.434615 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-serving-cert\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.434670 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-proxy-ca-bundles\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.434703 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-config\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.435915 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-client-ca\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.435983 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q2nj\" (UniqueName: \"kubernetes.io/projected/233361dc-6f20-402b-858c-41285e4e392e-kube-api-access-8q2nj\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.436017 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/233361dc-6f20-402b-858c-41285e4e392e-serving-cert\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.436674 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-proxy-ca-bundles\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.436735 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-client-ca\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.436826 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-client-ca\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.437732 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-config\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.436905 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-config\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.442165 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-serving-cert\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.455732 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/233361dc-6f20-402b-858c-41285e4e392e-serving-cert\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.457096 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r" event={"ID":"d3ec1cd8-503c-493e-a3a5-6ff6c17b0862","Type":"ContainerDied","Data":"d991283f3669cc94c354e07ac177f9d1d3a7e38cdce6b40dc9e059270714c33e"}
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.457163 4803 scope.go:117] "RemoveContainer" containerID="ad0198c8aeda7a43682c4987775dc644f0b6a3d9d51417b913aa496c2ab7abfa"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.457285 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8c575497-lsj7r"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.458276 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkvz\" (UniqueName: \"kubernetes.io/projected/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-kube-api-access-cgkvz\") pod \"route-controller-manager-6c98d4566d-zqztd\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") " pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.464705 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q2nj\" (UniqueName: \"kubernetes.io/projected/233361dc-6f20-402b-858c-41285e4e392e-kube-api-access-8q2nj\") pod \"controller-manager-6bbc9685c9-s6bwl\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.475061 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-kk4jn" event={"ID":"ab12c524-2ba2-4454-8079-af3be98f4ccf","Type":"ContainerStarted","Data":"a8cea7bf3f41c311d27941125e7470877c004df7a2053e18da861d666ef4bafb"}
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.527890 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b8c575497-lsj7r"]
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.534723 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b8c575497-lsj7r"]
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.552719 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.558770 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.859755 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab886873-9ec5-48f3-aea1-dc39ab313aee" path="/var/lib/kubelet/pods/ab886873-9ec5-48f3-aea1-dc39ab313aee/volumes"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.860580 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ec1cd8-503c-493e-a3a5-6ff6c17b0862" path="/var/lib/kubelet/pods/d3ec1cd8-503c-493e-a3a5-6ff6c17b0862/volumes"
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.873458 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"]
Mar 20 17:21:22 crc kubenswrapper[4803]: W0320 17:21:22.881172 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2d666d7_6e34_4fa0_9c53_8ebd7985b1f7.slice/crio-6370eb41ee6261af146cd9f3a8cf0a648953def6b1586c8d568bbe195890d127 WatchSource:0}: Error finding container 6370eb41ee6261af146cd9f3a8cf0a648953def6b1586c8d568bbe195890d127: Status 404 returned error can't find the container with id 6370eb41ee6261af146cd9f3a8cf0a648953def6b1586c8d568bbe195890d127
Mar 20 17:21:22 crc kubenswrapper[4803]: W0320 17:21:22.909395 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod233361dc_6f20_402b_858c_41285e4e392e.slice/crio-72dacc2fbd2c90ab6abc424ce80cfc2097573895a162badf1946a91e03b39048 WatchSource:0}: Error finding container 72dacc2fbd2c90ab6abc424ce80cfc2097573895a162badf1946a91e03b39048: Status 404 returned error can't find the container with id 72dacc2fbd2c90ab6abc424ce80cfc2097573895a162badf1946a91e03b39048
Mar 20 17:21:22 crc kubenswrapper[4803]: I0320 17:21:22.909908 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"]
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.482189 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" event={"ID":"233361dc-6f20-402b-858c-41285e4e392e","Type":"ContainerStarted","Data":"46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5"}
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.482252 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" event={"ID":"233361dc-6f20-402b-858c-41285e4e392e","Type":"ContainerStarted","Data":"72dacc2fbd2c90ab6abc424ce80cfc2097573895a162badf1946a91e03b39048"}
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.482375 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.483254 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-kk4jn" event={"ID":"ab12c524-2ba2-4454-8079-af3be98f4ccf","Type":"ContainerStarted","Data":"c954e60afe47b8d8245fb3678f7218c2125d7f2a12a3ddf401469e2b3ba49531"}
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.484507 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd" event={"ID":"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7","Type":"ContainerStarted","Data":"7925de67b880ebe7cc05d173889920f7c15ccafe76bf695ea430239d9148b980"}
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.484569 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd" event={"ID":"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7","Type":"ContainerStarted","Data":"6370eb41ee6261af146cd9f3a8cf0a648953def6b1586c8d568bbe195890d127"}
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.484707 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.486339 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.502944 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" podStartSLOduration=3.5029242 podStartE2EDuration="3.5029242s" podCreationTimestamp="2026-03-20 17:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:23.500899618 +0000 UTC m=+293.412491728" watchObservedRunningTime="2026-03-20 17:21:23.5029242 +0000 UTC m=+293.414516290"
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.517695 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567120-kk4jn" podStartSLOduration=2.234895008 podStartE2EDuration="3.517679178s" podCreationTimestamp="2026-03-20 17:21:20 +0000 UTC" firstStartedPulling="2026-03-20 17:21:21.85516385 +0000 UTC m=+291.766755920" lastFinishedPulling="2026-03-20 17:21:23.13794801 +0000 UTC m=+293.049540090" observedRunningTime="2026-03-20 17:21:23.513452439 +0000 UTC m=+293.425044529" watchObservedRunningTime="2026-03-20 17:21:23.517679178 +0000 UTC m=+293.429271268"
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.536285 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd" podStartSLOduration=3.536268292 podStartE2EDuration="3.536268292s" podCreationTimestamp="2026-03-20 17:21:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:23.532341953 +0000 UTC m=+293.443934033" watchObservedRunningTime="2026-03-20 17:21:23.536268292 +0000 UTC m=+293.447860362"
Mar 20 17:21:23 crc kubenswrapper[4803]: I0320 17:21:23.583836 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:24 crc kubenswrapper[4803]: I0320 17:21:24.492484 4803 generic.go:334] "Generic (PLEG): container finished" podID="ab12c524-2ba2-4454-8079-af3be98f4ccf" containerID="c954e60afe47b8d8245fb3678f7218c2125d7f2a12a3ddf401469e2b3ba49531" exitCode=0
Mar 20 17:21:24 crc kubenswrapper[4803]: I0320 17:21:24.492578 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-kk4jn" event={"ID":"ab12c524-2ba2-4454-8079-af3be98f4ccf","Type":"ContainerDied","Data":"c954e60afe47b8d8245fb3678f7218c2125d7f2a12a3ddf401469e2b3ba49531"}
Mar 20 17:21:25 crc kubenswrapper[4803]: I0320 17:21:25.758350 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-kk4jn"
Mar 20 17:21:25 crc kubenswrapper[4803]: I0320 17:21:25.789371 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztkhk\" (UniqueName: \"kubernetes.io/projected/ab12c524-2ba2-4454-8079-af3be98f4ccf-kube-api-access-ztkhk\") pod \"ab12c524-2ba2-4454-8079-af3be98f4ccf\" (UID: \"ab12c524-2ba2-4454-8079-af3be98f4ccf\") "
Mar 20 17:21:25 crc kubenswrapper[4803]: I0320 17:21:25.796545 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab12c524-2ba2-4454-8079-af3be98f4ccf-kube-api-access-ztkhk" (OuterVolumeSpecName: "kube-api-access-ztkhk") pod "ab12c524-2ba2-4454-8079-af3be98f4ccf" (UID: "ab12c524-2ba2-4454-8079-af3be98f4ccf"). InnerVolumeSpecName "kube-api-access-ztkhk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:21:25 crc kubenswrapper[4803]: I0320 17:21:25.890971 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztkhk\" (UniqueName: \"kubernetes.io/projected/ab12c524-2ba2-4454-8079-af3be98f4ccf-kube-api-access-ztkhk\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:26 crc kubenswrapper[4803]: I0320 17:21:26.505937 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567120-kk4jn" event={"ID":"ab12c524-2ba2-4454-8079-af3be98f4ccf","Type":"ContainerDied","Data":"a8cea7bf3f41c311d27941125e7470877c004df7a2053e18da861d666ef4bafb"}
Mar 20 17:21:26 crc kubenswrapper[4803]: I0320 17:21:26.505980 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8cea7bf3f41c311d27941125e7470877c004df7a2053e18da861d666ef4bafb"
Mar 20 17:21:26 crc kubenswrapper[4803]: I0320 17:21:26.506000 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567120-kk4jn"
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.297611 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"]
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.298309 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd" podUID="c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" containerName="route-controller-manager" containerID="cri-o://7925de67b880ebe7cc05d173889920f7c15ccafe76bf695ea430239d9148b980" gracePeriod=30
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.572073 4803 generic.go:334] "Generic (PLEG): container finished" podID="c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" containerID="7925de67b880ebe7cc05d173889920f7c15ccafe76bf695ea430239d9148b980" exitCode=0
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.572128 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd" event={"ID":"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7","Type":"ContainerDied","Data":"7925de67b880ebe7cc05d173889920f7c15ccafe76bf695ea430239d9148b980"}
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.732455 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.845591 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgkvz\" (UniqueName: \"kubernetes.io/projected/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-kube-api-access-cgkvz\") pod \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") "
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.845633 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-config\") pod \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") "
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.845689 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-client-ca\") pod \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") "
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.845749 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-serving-cert\") pod \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\" (UID: \"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7\") "
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.846672 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-config" (OuterVolumeSpecName: "config") pod "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" (UID: "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.846688 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-client-ca" (OuterVolumeSpecName: "client-ca") pod "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" (UID: "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.852090 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-kube-api-access-cgkvz" (OuterVolumeSpecName: "kube-api-access-cgkvz") pod "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" (UID: "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7"). InnerVolumeSpecName "kube-api-access-cgkvz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.852139 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" (UID: "c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.946745 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgkvz\" (UniqueName: \"kubernetes.io/projected/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-kube-api-access-cgkvz\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.946779 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.946789 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:37 crc kubenswrapper[4803]: I0320 17:21:37.946798 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:21:38 crc kubenswrapper[4803]: I0320 17:21:38.583913 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd" event={"ID":"c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7","Type":"ContainerDied","Data":"6370eb41ee6261af146cd9f3a8cf0a648953def6b1586c8d568bbe195890d127"}
Mar 20 17:21:38 crc kubenswrapper[4803]: I0320 17:21:38.583965 4803 scope.go:117] "RemoveContainer" containerID="7925de67b880ebe7cc05d173889920f7c15ccafe76bf695ea430239d9148b980"
Mar 20 17:21:38 crc kubenswrapper[4803]: I0320 17:21:38.584026 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd" Mar 20 17:21:38 crc kubenswrapper[4803]: I0320 17:21:38.621158 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"] Mar 20 17:21:38 crc kubenswrapper[4803]: I0320 17:21:38.624422 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98d4566d-zqztd"] Mar 20 17:21:38 crc kubenswrapper[4803]: I0320 17:21:38.855040 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" path="/var/lib/kubelet/pods/c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7/volumes" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.269992 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w"] Mar 20 17:21:39 crc kubenswrapper[4803]: E0320 17:21:39.270299 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab12c524-2ba2-4454-8079-af3be98f4ccf" containerName="oc" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.270316 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab12c524-2ba2-4454-8079-af3be98f4ccf" containerName="oc" Mar 20 17:21:39 crc kubenswrapper[4803]: E0320 17:21:39.270337 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" containerName="route-controller-manager" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.270345 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" containerName="route-controller-manager" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.270498 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d666d7-6e34-4fa0-9c53-8ebd7985b1f7" containerName="route-controller-manager" Mar 20 17:21:39 crc kubenswrapper[4803]: 
I0320 17:21:39.270515 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab12c524-2ba2-4454-8079-af3be98f4ccf" containerName="oc" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.271039 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.273470 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.273719 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.273845 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.273986 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.278974 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w"] Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.287867 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.288048 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.364730 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6035c4-2425-43d4-a31a-da69456788f5-config\") pod 
\"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.364791 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f6035c4-2425-43d4-a31a-da69456788f5-serving-cert\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.364823 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f6035c4-2425-43d4-a31a-da69456788f5-client-ca\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.364891 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc5v4\" (UniqueName: \"kubernetes.io/projected/2f6035c4-2425-43d4-a31a-da69456788f5-kube-api-access-jc5v4\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.465888 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc5v4\" (UniqueName: \"kubernetes.io/projected/2f6035c4-2425-43d4-a31a-da69456788f5-kube-api-access-jc5v4\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " 
pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.465955 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f6035c4-2425-43d4-a31a-da69456788f5-config\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.465987 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f6035c4-2425-43d4-a31a-da69456788f5-serving-cert\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.466005 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f6035c4-2425-43d4-a31a-da69456788f5-client-ca\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.466901 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f6035c4-2425-43d4-a31a-da69456788f5-client-ca\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.467764 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f6035c4-2425-43d4-a31a-da69456788f5-config\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.469981 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f6035c4-2425-43d4-a31a-da69456788f5-serving-cert\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.481622 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc5v4\" (UniqueName: \"kubernetes.io/projected/2f6035c4-2425-43d4-a31a-da69456788f5-kube-api-access-jc5v4\") pod \"route-controller-manager-86c69f7999-qdc9w\" (UID: \"2f6035c4-2425-43d4-a31a-da69456788f5\") " pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:39 crc kubenswrapper[4803]: I0320 17:21:39.591412 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:40 crc kubenswrapper[4803]: I0320 17:21:40.076835 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w"] Mar 20 17:21:40 crc kubenswrapper[4803]: I0320 17:21:40.601035 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" event={"ID":"2f6035c4-2425-43d4-a31a-da69456788f5","Type":"ContainerStarted","Data":"2b57abb5298fb12614d7f19dc27b48b313c0b9e33d39647a3d14341d7a17e27c"} Mar 20 17:21:40 crc kubenswrapper[4803]: I0320 17:21:40.601792 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:40 crc kubenswrapper[4803]: I0320 17:21:40.601875 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" event={"ID":"2f6035c4-2425-43d4-a31a-da69456788f5","Type":"ContainerStarted","Data":"b340604e83b5a866d6d7e52e89ca8615a6585c62e50244fca5a2fb3fc306160d"} Mar 20 17:21:40 crc kubenswrapper[4803]: I0320 17:21:40.623644 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" podStartSLOduration=3.623627155 podStartE2EDuration="3.623627155s" podCreationTimestamp="2026-03-20 17:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:21:40.620943114 +0000 UTC m=+310.532535194" watchObservedRunningTime="2026-03-20 17:21:40.623627155 +0000 UTC m=+310.535219225" Mar 20 17:21:40 crc kubenswrapper[4803]: I0320 17:21:40.815938 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-86c69f7999-qdc9w" Mar 20 17:21:57 crc kubenswrapper[4803]: I0320 17:21:57.699978 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"] Mar 20 17:21:57 crc kubenswrapper[4803]: I0320 17:21:57.700845 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" podUID="233361dc-6f20-402b-858c-41285e4e392e" containerName="controller-manager" containerID="cri-o://46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5" gracePeriod=30 Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.230710 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.346405 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/233361dc-6f20-402b-858c-41285e4e392e-serving-cert\") pod \"233361dc-6f20-402b-858c-41285e4e392e\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.346736 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-client-ca\") pod \"233361dc-6f20-402b-858c-41285e4e392e\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.346816 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-proxy-ca-bundles\") pod \"233361dc-6f20-402b-858c-41285e4e392e\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.346848 4803 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q2nj\" (UniqueName: \"kubernetes.io/projected/233361dc-6f20-402b-858c-41285e4e392e-kube-api-access-8q2nj\") pod \"233361dc-6f20-402b-858c-41285e4e392e\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.346872 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-config\") pod \"233361dc-6f20-402b-858c-41285e4e392e\" (UID: \"233361dc-6f20-402b-858c-41285e4e392e\") " Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.347990 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-config" (OuterVolumeSpecName: "config") pod "233361dc-6f20-402b-858c-41285e4e392e" (UID: "233361dc-6f20-402b-858c-41285e4e392e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.348052 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-client-ca" (OuterVolumeSpecName: "client-ca") pod "233361dc-6f20-402b-858c-41285e4e392e" (UID: "233361dc-6f20-402b-858c-41285e4e392e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.348306 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "233361dc-6f20-402b-858c-41285e4e392e" (UID: "233361dc-6f20-402b-858c-41285e4e392e"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.352354 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/233361dc-6f20-402b-858c-41285e4e392e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "233361dc-6f20-402b-858c-41285e4e392e" (UID: "233361dc-6f20-402b-858c-41285e4e392e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.352390 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/233361dc-6f20-402b-858c-41285e4e392e-kube-api-access-8q2nj" (OuterVolumeSpecName: "kube-api-access-8q2nj") pod "233361dc-6f20-402b-858c-41285e4e392e" (UID: "233361dc-6f20-402b-858c-41285e4e392e"). InnerVolumeSpecName "kube-api-access-8q2nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.448057 4803 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/233361dc-6f20-402b-858c-41285e4e392e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.448095 4803 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.448104 4803 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.448115 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q2nj\" (UniqueName: \"kubernetes.io/projected/233361dc-6f20-402b-858c-41285e4e392e-kube-api-access-8q2nj\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.448123 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/233361dc-6f20-402b-858c-41285e4e392e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.714420 4803 generic.go:334] "Generic (PLEG): container finished" podID="233361dc-6f20-402b-858c-41285e4e392e" containerID="46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5" exitCode=0 Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.714461 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" event={"ID":"233361dc-6f20-402b-858c-41285e4e392e","Type":"ContainerDied","Data":"46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5"} Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.714493 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" event={"ID":"233361dc-6f20-402b-858c-41285e4e392e","Type":"ContainerDied","Data":"72dacc2fbd2c90ab6abc424ce80cfc2097573895a162badf1946a91e03b39048"} Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.714510 4803 scope.go:117] "RemoveContainer" containerID="46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.714517 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.752330 4803 scope.go:117] "RemoveContainer" containerID="46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5" Mar 20 17:21:58 crc kubenswrapper[4803]: E0320 17:21:58.752877 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5\": container with ID starting with 46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5 not found: ID does not exist" containerID="46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.752938 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5"} err="failed to get container status \"46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5\": rpc error: code = NotFound desc = could not find container \"46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5\": container with ID starting with 46b644591efec7eff7150bf591ec3ff5e288b47ac70208323b8802e7717b3bb5 not found: ID does not exist" Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.773845 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"] Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.786432 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bbc9685c9-s6bwl"] Mar 20 17:21:58 crc kubenswrapper[4803]: I0320 17:21:58.857600 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="233361dc-6f20-402b-858c-41285e4e392e" path="/var/lib/kubelet/pods/233361dc-6f20-402b-858c-41285e4e392e/volumes" Mar 20 17:21:59 crc 
kubenswrapper[4803]: I0320 17:21:59.273097 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b4c86b479-jkzrr"] Mar 20 17:21:59 crc kubenswrapper[4803]: E0320 17:21:59.273470 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="233361dc-6f20-402b-858c-41285e4e392e" containerName="controller-manager" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.273514 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="233361dc-6f20-402b-858c-41285e4e392e" containerName="controller-manager" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.273862 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="233361dc-6f20-402b-858c-41285e4e392e" containerName="controller-manager" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.274681 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.279640 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.280029 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.280360 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.282145 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.282437 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.282944 4803 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.285474 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b4c86b479-jkzrr"] Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.292691 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.359225 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zbnt\" (UniqueName: \"kubernetes.io/projected/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-kube-api-access-8zbnt\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.359333 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-client-ca\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.359380 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-serving-cert\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.359626 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-config\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.359680 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-proxy-ca-bundles\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.460797 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-client-ca\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.460879 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-serving-cert\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.460971 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-config\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.461006 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-proxy-ca-bundles\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.461087 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zbnt\" (UniqueName: \"kubernetes.io/projected/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-kube-api-access-8zbnt\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.462513 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-client-ca\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.462872 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-config\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.463094 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-proxy-ca-bundles\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " 
pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.468174 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-serving-cert\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.477592 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zbnt\" (UniqueName: \"kubernetes.io/projected/2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73-kube-api-access-8zbnt\") pod \"controller-manager-6b4c86b479-jkzrr\" (UID: \"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73\") " pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.598716 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.611468 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cw85j"] Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.612432 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.623752 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cw85j"] Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.662899 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-registry-tls\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.662941 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbs9x\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-kube-api-access-fbs9x\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.662965 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3794adfa-4dd3-44c0-b03c-3d4213e9e555-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.662997 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.663032 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-bound-sa-token\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.663060 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3794adfa-4dd3-44c0-b03c-3d4213e9e555-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.663082 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3794adfa-4dd3-44c0-b03c-3d4213e9e555-trusted-ca\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.663099 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3794adfa-4dd3-44c0-b03c-3d4213e9e555-registry-certificates\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.702885 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.765120 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbs9x\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-kube-api-access-fbs9x\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.765161 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3794adfa-4dd3-44c0-b03c-3d4213e9e555-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.765212 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-bound-sa-token\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.765691 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3794adfa-4dd3-44c0-b03c-3d4213e9e555-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.765710 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3794adfa-4dd3-44c0-b03c-3d4213e9e555-trusted-ca\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.765727 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3794adfa-4dd3-44c0-b03c-3d4213e9e555-registry-certificates\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.765754 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-registry-tls\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.767272 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3794adfa-4dd3-44c0-b03c-3d4213e9e555-registry-certificates\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.767300 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3794adfa-4dd3-44c0-b03c-3d4213e9e555-ca-trust-extracted\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 
17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.768298 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3794adfa-4dd3-44c0-b03c-3d4213e9e555-trusted-ca\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.773691 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3794adfa-4dd3-44c0-b03c-3d4213e9e555-installation-pull-secrets\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.774675 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-registry-tls\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.784030 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-bound-sa-token\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.784689 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbs9x\" (UniqueName: \"kubernetes.io/projected/3794adfa-4dd3-44c0-b03c-3d4213e9e555-kube-api-access-fbs9x\") pod \"image-registry-66df7c8f76-cw85j\" (UID: \"3794adfa-4dd3-44c0-b03c-3d4213e9e555\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:21:59 crc kubenswrapper[4803]: I0320 17:21:59.961906 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.069749 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b4c86b479-jkzrr"] Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.145951 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567122-nz79x"] Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.147201 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-nz79x" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.149756 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.149785 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.150013 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.172171 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-nz79x"] Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.240384 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-cw85j"] Mar 20 17:22:00 crc kubenswrapper[4803]: W0320 17:22:00.249872 4803 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3794adfa_4dd3_44c0_b03c_3d4213e9e555.slice/crio-d16a76a87dd467838e604dc4b135d2c72c295aa43c6ddcc69bb7b2e6be9127da WatchSource:0}: Error finding container d16a76a87dd467838e604dc4b135d2c72c295aa43c6ddcc69bb7b2e6be9127da: Status 404 returned error can't find the container with id d16a76a87dd467838e604dc4b135d2c72c295aa43c6ddcc69bb7b2e6be9127da Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.270437 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjqmp\" (UniqueName: \"kubernetes.io/projected/c857ab36-26c2-4a5a-8ab3-c92ef681c2ac-kube-api-access-bjqmp\") pod \"auto-csr-approver-29567122-nz79x\" (UID: \"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac\") " pod="openshift-infra/auto-csr-approver-29567122-nz79x" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.371976 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjqmp\" (UniqueName: \"kubernetes.io/projected/c857ab36-26c2-4a5a-8ab3-c92ef681c2ac-kube-api-access-bjqmp\") pod \"auto-csr-approver-29567122-nz79x\" (UID: \"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac\") " pod="openshift-infra/auto-csr-approver-29567122-nz79x" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.392335 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjqmp\" (UniqueName: \"kubernetes.io/projected/c857ab36-26c2-4a5a-8ab3-c92ef681c2ac-kube-api-access-bjqmp\") pod \"auto-csr-approver-29567122-nz79x\" (UID: \"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac\") " pod="openshift-infra/auto-csr-approver-29567122-nz79x" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.483542 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-nz79x" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.725444 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" event={"ID":"3794adfa-4dd3-44c0-b03c-3d4213e9e555","Type":"ContainerStarted","Data":"4e364ea006d80edcab76fc7c6a3bf21339ba532579f08a54d35a1d46d0d00dfb"} Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.725928 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.725944 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" event={"ID":"3794adfa-4dd3-44c0-b03c-3d4213e9e555","Type":"ContainerStarted","Data":"d16a76a87dd467838e604dc4b135d2c72c295aa43c6ddcc69bb7b2e6be9127da"} Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.727140 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" event={"ID":"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73","Type":"ContainerStarted","Data":"64a82b3b898a545430a6c4a9bf016003644c493f43f76c291009e71b2c83be53"} Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.727165 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" event={"ID":"2d7c2b13-4fd1-4aa3-ad72-b5119e2c3c73","Type":"ContainerStarted","Data":"4d9d728c152dc1eb386b9941e66c2695fd5a4402f8fc9cc1a489a5aeb4fc2e18"} Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.727797 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.737706 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.750409 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" podStartSLOduration=1.750394153 podStartE2EDuration="1.750394153s" podCreationTimestamp="2026-03-20 17:21:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:00.74559058 +0000 UTC m=+330.657182660" watchObservedRunningTime="2026-03-20 17:22:00.750394153 +0000 UTC m=+330.661986223" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.764557 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b4c86b479-jkzrr" podStartSLOduration=3.764542272 podStartE2EDuration="3.764542272s" podCreationTimestamp="2026-03-20 17:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:00.762355398 +0000 UTC m=+330.673947478" watchObservedRunningTime="2026-03-20 17:22:00.764542272 +0000 UTC m=+330.676134342" Mar 20 17:22:00 crc kubenswrapper[4803]: I0320 17:22:00.796107 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-nz79x"] Mar 20 17:22:01 crc kubenswrapper[4803]: I0320 17:22:01.736805 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-nz79x" event={"ID":"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac","Type":"ContainerStarted","Data":"f66a517dfdb44f6eb3587115aeab95766df0ca3e66a5c8714c1a74832daf9840"} Mar 20 17:22:02 crc kubenswrapper[4803]: I0320 17:22:02.744726 4803 generic.go:334] "Generic (PLEG): container finished" podID="c857ab36-26c2-4a5a-8ab3-c92ef681c2ac" containerID="5e587bbaff39129ebfc02d71b8e2b857d989fd5c0ce4afc73eaf69943e0b3a50" 
exitCode=0 Mar 20 17:22:02 crc kubenswrapper[4803]: I0320 17:22:02.744791 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-nz79x" event={"ID":"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac","Type":"ContainerDied","Data":"5e587bbaff39129ebfc02d71b8e2b857d989fd5c0ce4afc73eaf69943e0b3a50"} Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.797536 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gddcq"] Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.798434 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gddcq" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="registry-server" containerID="cri-o://f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b" gracePeriod=30 Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.806586 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjntl"] Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.806893 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gjntl" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="registry-server" containerID="cri-o://8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f" gracePeriod=30 Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.821548 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g6tsw"] Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.821807 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerName="marketplace-operator" containerID="cri-o://8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713" gracePeriod=30 Mar 
20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.853443 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr2dq"] Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.853835 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mr2dq" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="registry-server" containerID="cri-o://c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135" gracePeriod=30 Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.863468 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s92b2"] Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.865910 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.874864 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t78tx"] Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.875198 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t78tx" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="registry-server" containerID="cri-o://455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c" gracePeriod=30 Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.883114 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s92b2"] Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.956348 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmjhf\" (UniqueName: \"kubernetes.io/projected/617ba1c8-b42d-4e9c-8d5b-6a903f267358-kube-api-access-jmjhf\") pod \"marketplace-operator-79b997595-s92b2\" (UID: 
\"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.956397 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/617ba1c8-b42d-4e9c-8d5b-6a903f267358-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:03 crc kubenswrapper[4803]: I0320 17:22:03.956552 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/617ba1c8-b42d-4e9c-8d5b-6a903f267358-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.057397 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmjhf\" (UniqueName: \"kubernetes.io/projected/617ba1c8-b42d-4e9c-8d5b-6a903f267358-kube-api-access-jmjhf\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.057723 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/617ba1c8-b42d-4e9c-8d5b-6a903f267358-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.057801 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/617ba1c8-b42d-4e9c-8d5b-6a903f267358-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.058766 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/617ba1c8-b42d-4e9c-8d5b-6a903f267358-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.065350 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/617ba1c8-b42d-4e9c-8d5b-6a903f267358-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.073012 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmjhf\" (UniqueName: \"kubernetes.io/projected/617ba1c8-b42d-4e9c-8d5b-6a903f267358-kube-api-access-jmjhf\") pod \"marketplace-operator-79b997595-s92b2\" (UID: \"617ba1c8-b42d-4e9c-8d5b-6a903f267358\") " pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.206779 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.229653 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-nz79x" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.261584 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjqmp\" (UniqueName: \"kubernetes.io/projected/c857ab36-26c2-4a5a-8ab3-c92ef681c2ac-kube-api-access-bjqmp\") pod \"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac\" (UID: \"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.268662 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c857ab36-26c2-4a5a-8ab3-c92ef681c2ac-kube-api-access-bjqmp" (OuterVolumeSpecName: "kube-api-access-bjqmp") pod "c857ab36-26c2-4a5a-8ab3-c92ef681c2ac" (UID: "c857ab36-26c2-4a5a-8ab3-c92ef681c2ac"). InnerVolumeSpecName "kube-api-access-bjqmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.362897 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjqmp\" (UniqueName: \"kubernetes.io/projected/c857ab36-26c2-4a5a-8ab3-c92ef681c2ac-kube-api-access-bjqmp\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.394395 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.463318 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-catalog-content\") pod \"a5db8851-4faf-41b9-9f19-56ae943e1f07\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.463599 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8fms\" (UniqueName: \"kubernetes.io/projected/a5db8851-4faf-41b9-9f19-56ae943e1f07-kube-api-access-z8fms\") pod \"a5db8851-4faf-41b9-9f19-56ae943e1f07\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.463728 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-utilities\") pod \"a5db8851-4faf-41b9-9f19-56ae943e1f07\" (UID: \"a5db8851-4faf-41b9-9f19-56ae943e1f07\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.468588 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-utilities" (OuterVolumeSpecName: "utilities") pod "a5db8851-4faf-41b9-9f19-56ae943e1f07" (UID: "a5db8851-4faf-41b9-9f19-56ae943e1f07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.471000 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.471334 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.475234 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5db8851-4faf-41b9-9f19-56ae943e1f07-kube-api-access-z8fms" (OuterVolumeSpecName: "kube-api-access-z8fms") pod "a5db8851-4faf-41b9-9f19-56ae943e1f07" (UID: "a5db8851-4faf-41b9-9f19-56ae943e1f07"). InnerVolumeSpecName "kube-api-access-z8fms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.492823 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.494582 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t78tx" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.533830 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5db8851-4faf-41b9-9f19-56ae943e1f07" (UID: "a5db8851-4faf-41b9-9f19-56ae943e1f07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573013 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdrbd\" (UniqueName: \"kubernetes.io/projected/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-kube-api-access-pdrbd\") pod \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573061 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-catalog-content\") pod \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573088 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vc4q\" (UniqueName: \"kubernetes.io/projected/50a15cf7-9fc3-45dc-a960-a85e930f8365-kube-api-access-4vc4q\") pod \"50a15cf7-9fc3-45dc-a960-a85e930f8365\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573109 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-utilities\") pod \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573140 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-trusted-ca\") pod \"50a15cf7-9fc3-45dc-a960-a85e930f8365\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573159 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-q8r5q\" (UniqueName: \"kubernetes.io/projected/139768c1-c8fa-4890-952b-2a9f3e152ca3-kube-api-access-q8r5q\") pod \"139768c1-c8fa-4890-952b-2a9f3e152ca3\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573206 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-operator-metrics\") pod \"50a15cf7-9fc3-45dc-a960-a85e930f8365\" (UID: \"50a15cf7-9fc3-45dc-a960-a85e930f8365\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573227 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-catalog-content\") pod \"139768c1-c8fa-4890-952b-2a9f3e152ca3\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573251 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-catalog-content\") pod \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\" (UID: \"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573269 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vh4n\" (UniqueName: \"kubernetes.io/projected/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-kube-api-access-9vh4n\") pod \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\" (UID: \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573307 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-utilities\") pod \"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\" (UID: 
\"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573330 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-utilities\") pod \"139768c1-c8fa-4890-952b-2a9f3e152ca3\" (UID: \"139768c1-c8fa-4890-952b-2a9f3e152ca3\") " Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573568 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573579 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5db8851-4faf-41b9-9f19-56ae943e1f07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.573589 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8fms\" (UniqueName: \"kubernetes.io/projected/a5db8851-4faf-41b9-9f19-56ae943e1f07-kube-api-access-z8fms\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.576005 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "50a15cf7-9fc3-45dc-a960-a85e930f8365" (UID: "50a15cf7-9fc3-45dc-a960-a85e930f8365"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.576348 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-kube-api-access-pdrbd" (OuterVolumeSpecName: "kube-api-access-pdrbd") pod "46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" (UID: "46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3"). InnerVolumeSpecName "kube-api-access-pdrbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.578444 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-utilities" (OuterVolumeSpecName: "utilities") pod "139768c1-c8fa-4890-952b-2a9f3e152ca3" (UID: "139768c1-c8fa-4890-952b-2a9f3e152ca3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.579957 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a15cf7-9fc3-45dc-a960-a85e930f8365-kube-api-access-4vc4q" (OuterVolumeSpecName: "kube-api-access-4vc4q") pod "50a15cf7-9fc3-45dc-a960-a85e930f8365" (UID: "50a15cf7-9fc3-45dc-a960-a85e930f8365"). InnerVolumeSpecName "kube-api-access-4vc4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.580631 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-utilities" (OuterVolumeSpecName: "utilities") pod "46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" (UID: "46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.580961 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-kube-api-access-9vh4n" (OuterVolumeSpecName: "kube-api-access-9vh4n") pod "baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" (UID: "baf113de-0d1a-4ddf-9ed5-01e25b1bb66e"). InnerVolumeSpecName "kube-api-access-9vh4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.581147 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "50a15cf7-9fc3-45dc-a960-a85e930f8365" (UID: "50a15cf7-9fc3-45dc-a960-a85e930f8365"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.581821 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-utilities" (OuterVolumeSpecName: "utilities") pod "baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" (UID: "baf113de-0d1a-4ddf-9ed5-01e25b1bb66e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.583901 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/139768c1-c8fa-4890-952b-2a9f3e152ca3-kube-api-access-q8r5q" (OuterVolumeSpecName: "kube-api-access-q8r5q") pod "139768c1-c8fa-4890-952b-2a9f3e152ca3" (UID: "139768c1-c8fa-4890-952b-2a9f3e152ca3"). InnerVolumeSpecName "kube-api-access-q8r5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.628159 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" (UID: "baf113de-0d1a-4ddf-9ed5-01e25b1bb66e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.632791 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" (UID: "46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675172 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdrbd\" (UniqueName: \"kubernetes.io/projected/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-kube-api-access-pdrbd\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675209 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675221 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vc4q\" (UniqueName: \"kubernetes.io/projected/50a15cf7-9fc3-45dc-a960-a85e930f8365-kube-api-access-4vc4q\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675233 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675247 4803 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675259 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8r5q\" (UniqueName: \"kubernetes.io/projected/139768c1-c8fa-4890-952b-2a9f3e152ca3-kube-api-access-q8r5q\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675270 4803 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/50a15cf7-9fc3-45dc-a960-a85e930f8365-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675283 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675299 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vh4n\" (UniqueName: \"kubernetes.io/projected/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-kube-api-access-9vh4n\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675310 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.675322 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.713099 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "139768c1-c8fa-4890-952b-2a9f3e152ca3" (UID: "139768c1-c8fa-4890-952b-2a9f3e152ca3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.759286 4803 generic.go:334] "Generic (PLEG): container finished" podID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerID="8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713" exitCode=0 Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.759370 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" event={"ID":"50a15cf7-9fc3-45dc-a960-a85e930f8365","Type":"ContainerDied","Data":"8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.759416 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" event={"ID":"50a15cf7-9fc3-45dc-a960-a85e930f8365","Type":"ContainerDied","Data":"75ff172ae9a57e498a5f8ed0422d73b3e9286926699d2e857815a6b6459caeee"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.759433 4803 scope.go:117] "RemoveContainer" containerID="8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.759560 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-g6tsw" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.762866 4803 generic.go:334] "Generic (PLEG): container finished" podID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerID="c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135" exitCode=0 Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.762921 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerDied","Data":"c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.762938 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mr2dq" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.762957 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mr2dq" event={"ID":"a5db8851-4faf-41b9-9f19-56ae943e1f07","Type":"ContainerDied","Data":"05ec152d2b12af2991e4e227047c7db968a1983b123f5774695cc57ce0d81404"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.764937 4803 generic.go:334] "Generic (PLEG): container finished" podID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerID="8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f" exitCode=0 Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.765043 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjntl" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.765072 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjntl" event={"ID":"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3","Type":"ContainerDied","Data":"8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.765115 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjntl" event={"ID":"46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3","Type":"ContainerDied","Data":"8fe02b7fc67557ba58a009bd40e8fb745dbad605f300b1392512c4b80dc29a19"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.770936 4803 generic.go:334] "Generic (PLEG): container finished" podID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerID="f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b" exitCode=0 Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.771032 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gddcq" event={"ID":"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e","Type":"ContainerDied","Data":"f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.771062 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gddcq" event={"ID":"baf113de-0d1a-4ddf-9ed5-01e25b1bb66e","Type":"ContainerDied","Data":"a5a26bf058d1b32792ec7d11d6a8a7aff9e1c205f08d92179a602e28890ed1f8"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.771721 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gddcq" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.776480 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/139768c1-c8fa-4890-952b-2a9f3e152ca3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.781428 4803 generic.go:334] "Generic (PLEG): container finished" podID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerID="455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c" exitCode=0 Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.781511 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t78tx" event={"ID":"139768c1-c8fa-4890-952b-2a9f3e152ca3","Type":"ContainerDied","Data":"455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.781703 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t78tx" event={"ID":"139768c1-c8fa-4890-952b-2a9f3e152ca3","Type":"ContainerDied","Data":"070f7ce915aa86c7d1c3e0233e4733354dce6c39d63c09b6188e749cf8d26a64"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.781757 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t78tx" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.786749 4803 scope.go:117] "RemoveContainer" containerID="8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.786999 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-s92b2"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.787124 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567122-nz79x" event={"ID":"c857ab36-26c2-4a5a-8ab3-c92ef681c2ac","Type":"ContainerDied","Data":"f66a517dfdb44f6eb3587115aeab95766df0ca3e66a5c8714c1a74832daf9840"} Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.787156 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f66a517dfdb44f6eb3587115aeab95766df0ca3e66a5c8714c1a74832daf9840" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.787207 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567122-nz79x" Mar 20 17:22:04 crc kubenswrapper[4803]: E0320 17:22:04.787442 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713\": container with ID starting with 8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713 not found: ID does not exist" containerID="8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.787480 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713"} err="failed to get container status \"8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713\": rpc error: code = NotFound desc = could not find container \"8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713\": container with ID starting with 8e56d3a1b82cf3d68a6650b67560bcea31506c73816d9ddf8e0bd151af8e5713 not found: ID does not exist" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.787511 4803 scope.go:117] "RemoveContainer" containerID="c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.813450 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g6tsw"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.819373 4803 scope.go:117] "RemoveContainer" containerID="31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.823462 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-g6tsw"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.829115 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-gjntl"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.835361 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gjntl"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.842565 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t78tx"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.853731 4803 scope.go:117] "RemoveContainer" containerID="64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.867963 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" path="/var/lib/kubelet/pods/46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3/volumes" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.868728 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" path="/var/lib/kubelet/pods/50a15cf7-9fc3-45dc-a960-a85e930f8365/volumes" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.869231 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t78tx"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.873277 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gddcq"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.877946 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gddcq"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.879787 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr2dq"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.882609 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mr2dq"] Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 
17:22:04.902153 4803 scope.go:117] "RemoveContainer" containerID="c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135" Mar 20 17:22:04 crc kubenswrapper[4803]: E0320 17:22:04.902538 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135\": container with ID starting with c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135 not found: ID does not exist" containerID="c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.902579 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135"} err="failed to get container status \"c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135\": rpc error: code = NotFound desc = could not find container \"c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135\": container with ID starting with c4ee69b8db6a42987560e8dbd9f32f08e52ef375241b5b1030053b94cf73a135 not found: ID does not exist" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.902607 4803 scope.go:117] "RemoveContainer" containerID="31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a" Mar 20 17:22:04 crc kubenswrapper[4803]: E0320 17:22:04.902906 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a\": container with ID starting with 31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a not found: ID does not exist" containerID="31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.902957 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a"} err="failed to get container status \"31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a\": rpc error: code = NotFound desc = could not find container \"31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a\": container with ID starting with 31917b3a159520c67b3740548016deb67ffb7d9b1b0ce4f261999afb4168c53a not found: ID does not exist" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.902991 4803 scope.go:117] "RemoveContainer" containerID="64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984" Mar 20 17:22:04 crc kubenswrapper[4803]: E0320 17:22:04.903385 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984\": container with ID starting with 64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984 not found: ID does not exist" containerID="64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.903408 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984"} err="failed to get container status \"64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984\": rpc error: code = NotFound desc = could not find container \"64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984\": container with ID starting with 64359fb93f3ad4f3e78e91aed574db53036ebf1ebcd2aeb6cb60922428a77984 not found: ID does not exist" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.903421 4803 scope.go:117] "RemoveContainer" containerID="8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.919144 4803 scope.go:117] "RemoveContainer" 
containerID="dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.930622 4803 scope.go:117] "RemoveContainer" containerID="25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.943199 4803 scope.go:117] "RemoveContainer" containerID="8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f" Mar 20 17:22:04 crc kubenswrapper[4803]: E0320 17:22:04.943493 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f\": container with ID starting with 8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f not found: ID does not exist" containerID="8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.943561 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f"} err="failed to get container status \"8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f\": rpc error: code = NotFound desc = could not find container \"8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f\": container with ID starting with 8bc52517c15ba9df195d024714c8f3253837f59bb3de7a673e200088b1d81c3f not found: ID does not exist" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.943590 4803 scope.go:117] "RemoveContainer" containerID="dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a" Mar 20 17:22:04 crc kubenswrapper[4803]: E0320 17:22:04.943793 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a\": container with ID starting with 
dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a not found: ID does not exist" containerID="dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.943819 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a"} err="failed to get container status \"dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a\": rpc error: code = NotFound desc = could not find container \"dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a\": container with ID starting with dcfbcd496420de20f4ff712742481b1e9569b6f6daeae9b758e89fe2c64b457a not found: ID does not exist" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.943837 4803 scope.go:117] "RemoveContainer" containerID="25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79" Mar 20 17:22:04 crc kubenswrapper[4803]: E0320 17:22:04.944102 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79\": container with ID starting with 25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79 not found: ID does not exist" containerID="25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.944149 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79"} err="failed to get container status \"25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79\": rpc error: code = NotFound desc = could not find container \"25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79\": container with ID starting with 25496a12795aef22aba1e3c626a84ea8af003ebf493e2b062317717e35da3b79 not found: ID does not 
exist" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.944182 4803 scope.go:117] "RemoveContainer" containerID="f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b" Mar 20 17:22:04 crc kubenswrapper[4803]: I0320 17:22:04.988463 4803 scope.go:117] "RemoveContainer" containerID="765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.004124 4803 scope.go:117] "RemoveContainer" containerID="9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.025889 4803 scope.go:117] "RemoveContainer" containerID="f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b" Mar 20 17:22:05 crc kubenswrapper[4803]: E0320 17:22:05.026335 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b\": container with ID starting with f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b not found: ID does not exist" containerID="f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.026367 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b"} err="failed to get container status \"f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b\": rpc error: code = NotFound desc = could not find container \"f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b\": container with ID starting with f89af609408fc002ca9839942e09fd335d9ab69e16801987af13a7e49492cc7b not found: ID does not exist" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.026389 4803 scope.go:117] "RemoveContainer" containerID="765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3" Mar 20 17:22:05 crc 
kubenswrapper[4803]: E0320 17:22:05.026615 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3\": container with ID starting with 765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3 not found: ID does not exist" containerID="765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.026660 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3"} err="failed to get container status \"765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3\": rpc error: code = NotFound desc = could not find container \"765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3\": container with ID starting with 765b7dac51332637513dc9c0a256dc89061c6b78af2943b22c439dc6728e6df3 not found: ID does not exist" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.026676 4803 scope.go:117] "RemoveContainer" containerID="9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e" Mar 20 17:22:05 crc kubenswrapper[4803]: E0320 17:22:05.027354 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e\": container with ID starting with 9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e not found: ID does not exist" containerID="9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.027377 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e"} err="failed to get container status 
\"9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e\": rpc error: code = NotFound desc = could not find container \"9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e\": container with ID starting with 9818fe3dff1d0bf2cd114439d7b1a4ec8e0a6b568b66c9e7c938403abca6109e not found: ID does not exist" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.027390 4803 scope.go:117] "RemoveContainer" containerID="455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.044079 4803 scope.go:117] "RemoveContainer" containerID="982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.063487 4803 scope.go:117] "RemoveContainer" containerID="d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.084982 4803 scope.go:117] "RemoveContainer" containerID="455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c" Mar 20 17:22:05 crc kubenswrapper[4803]: E0320 17:22:05.085439 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c\": container with ID starting with 455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c not found: ID does not exist" containerID="455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.085467 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c"} err="failed to get container status \"455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c\": rpc error: code = NotFound desc = could not find container \"455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c\": container with ID starting 
with 455a7f2091027d44b6f0221b1f9310e5cb7fb34f507b58b652b8a61a38dd053c not found: ID does not exist" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.085487 4803 scope.go:117] "RemoveContainer" containerID="982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e" Mar 20 17:22:05 crc kubenswrapper[4803]: E0320 17:22:05.085952 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e\": container with ID starting with 982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e not found: ID does not exist" containerID="982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.085974 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e"} err="failed to get container status \"982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e\": rpc error: code = NotFound desc = could not find container \"982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e\": container with ID starting with 982989d2ce4cbe09cf5b7ab029d0cf2bfa499f82bd34d52c70a1e67439649d6e not found: ID does not exist" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.085989 4803 scope.go:117] "RemoveContainer" containerID="d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1" Mar 20 17:22:05 crc kubenswrapper[4803]: E0320 17:22:05.086200 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1\": container with ID starting with d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1 not found: ID does not exist" containerID="d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1" Mar 20 17:22:05 
crc kubenswrapper[4803]: I0320 17:22:05.086221 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1"} err="failed to get container status \"d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1\": rpc error: code = NotFound desc = could not find container \"d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1\": container with ID starting with d5b19d0e89d0e7e7e80e501a0a2a27d1f1bc01d0a25a2eded93d84334e41cea1 not found: ID does not exist" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.795612 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" event={"ID":"617ba1c8-b42d-4e9c-8d5b-6a903f267358","Type":"ContainerStarted","Data":"42ab2cf28cd5ceb52fc43296868d95fb9ca5db0062371b663a163f969ee06500"} Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.795658 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" event={"ID":"617ba1c8-b42d-4e9c-8d5b-6a903f267358","Type":"ContainerStarted","Data":"70db51188628dc72584b6ebddb953fee452e4dd8a478b46154d0eca48686b32c"} Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.795886 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.800128 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" Mar 20 17:22:05 crc kubenswrapper[4803]: I0320 17:22:05.825835 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-s92b2" podStartSLOduration=2.825808236 podStartE2EDuration="2.825808236s" podCreationTimestamp="2026-03-20 17:22:03 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:22:05.822861038 +0000 UTC m=+335.734453198" watchObservedRunningTime="2026-03-20 17:22:05.825808236 +0000 UTC m=+335.737400386" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.858805 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" path="/var/lib/kubelet/pods/139768c1-c8fa-4890-952b-2a9f3e152ca3/volumes" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.860026 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" path="/var/lib/kubelet/pods/a5db8851-4faf-41b9-9f19-56ae943e1f07/volumes" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.860815 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" path="/var/lib/kubelet/pods/baf113de-0d1a-4ddf-9ed5-01e25b1bb66e/volumes" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890212 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4m7j6"] Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890380 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890392 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890403 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890409 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 
17:22:06.890420 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890426 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890434 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890441 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890451 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890456 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890464 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890469 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890480 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890486 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 
17:22:06.890493 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c857ab36-26c2-4a5a-8ab3-c92ef681c2ac" containerName="oc" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890498 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c857ab36-26c2-4a5a-8ab3-c92ef681c2ac" containerName="oc" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890505 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerName="marketplace-operator" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890511 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerName="marketplace-operator" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890534 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890542 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890552 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890558 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="extract-content" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890565 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890572 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890581 4803 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890596 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="extract-utilities" Mar 20 17:22:06 crc kubenswrapper[4803]: E0320 17:22:06.890605 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890611 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890700 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="46aaed67-fb9c-41fa-9f9f-9bfb3b2dfbf3" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890713 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf113de-0d1a-4ddf-9ed5-01e25b1bb66e" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890723 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c857ab36-26c2-4a5a-8ab3-c92ef681c2ac" containerName="oc" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890731 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a15cf7-9fc3-45dc-a960-a85e930f8365" containerName="marketplace-operator" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890737 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5db8851-4faf-41b9-9f19-56ae943e1f07" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.890745 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="139768c1-c8fa-4890-952b-2a9f3e152ca3" containerName="registry-server" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 
17:22:06.891351 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.893926 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.903141 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-utilities\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.903253 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94xg8\" (UniqueName: \"kubernetes.io/projected/ac03c75a-1844-4a22-9a24-4fa1720906be-kube-api-access-94xg8\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.903318 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-catalog-content\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:06 crc kubenswrapper[4803]: I0320 17:22:06.916173 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m7j6"] Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.005075 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-utilities\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.005164 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94xg8\" (UniqueName: \"kubernetes.io/projected/ac03c75a-1844-4a22-9a24-4fa1720906be-kube-api-access-94xg8\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.005205 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-catalog-content\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.005834 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-utilities\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.005851 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-catalog-content\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.045914 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94xg8\" (UniqueName: 
\"kubernetes.io/projected/ac03c75a-1844-4a22-9a24-4fa1720906be-kube-api-access-94xg8\") pod \"community-operators-4m7j6\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.066091 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hbxm"] Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.067679 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.075109 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.089828 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hbxm"] Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.106474 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-utilities\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.106642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6flqq\" (UniqueName: \"kubernetes.io/projected/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-kube-api-access-6flqq\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.106726 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-catalog-content\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.208114 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6flqq\" (UniqueName: \"kubernetes.io/projected/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-kube-api-access-6flqq\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.208322 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-catalog-content\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.208387 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-utilities\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.209269 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-utilities\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.209359 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-catalog-content\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.212015 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.242773 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6flqq\" (UniqueName: \"kubernetes.io/projected/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-kube-api-access-6flqq\") pod \"certified-operators-2hbxm\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.401084 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.662428 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4m7j6"] Mar 20 17:22:07 crc kubenswrapper[4803]: W0320 17:22:07.670863 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac03c75a_1844_4a22_9a24_4fa1720906be.slice/crio-38cf0412c8af63a63dda2198861754884a3fec269de85be635b838bc52218ae6 WatchSource:0}: Error finding container 38cf0412c8af63a63dda2198861754884a3fec269de85be635b838bc52218ae6: Status 404 returned error can't find the container with id 38cf0412c8af63a63dda2198861754884a3fec269de85be635b838bc52218ae6 Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.822240 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" 
event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerStarted","Data":"8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8"} Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.822288 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerStarted","Data":"38cf0412c8af63a63dda2198861754884a3fec269de85be635b838bc52218ae6"} Mar 20 17:22:07 crc kubenswrapper[4803]: I0320 17:22:07.881739 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hbxm"] Mar 20 17:22:07 crc kubenswrapper[4803]: W0320 17:22:07.895708 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1eeda39_65c5_416d_9ad7_1de7c20e49f7.slice/crio-4e276ff67fe28843ac299db8ed322860c82d5aaa81c24ce20a01cf9a3e14af1e WatchSource:0}: Error finding container 4e276ff67fe28843ac299db8ed322860c82d5aaa81c24ce20a01cf9a3e14af1e: Status 404 returned error can't find the container with id 4e276ff67fe28843ac299db8ed322860c82d5aaa81c24ce20a01cf9a3e14af1e Mar 20 17:22:08 crc kubenswrapper[4803]: I0320 17:22:08.832787 4803 generic.go:334] "Generic (PLEG): container finished" podID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerID="0582dca2ee1d31a8801270f1a7afa0eedaa47141b11444525247240a29a52b39" exitCode=0 Mar 20 17:22:08 crc kubenswrapper[4803]: I0320 17:22:08.833046 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hbxm" event={"ID":"d1eeda39-65c5-416d-9ad7-1de7c20e49f7","Type":"ContainerDied","Data":"0582dca2ee1d31a8801270f1a7afa0eedaa47141b11444525247240a29a52b39"} Mar 20 17:22:08 crc kubenswrapper[4803]: I0320 17:22:08.833367 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hbxm" 
event={"ID":"d1eeda39-65c5-416d-9ad7-1de7c20e49f7","Type":"ContainerStarted","Data":"4e276ff67fe28843ac299db8ed322860c82d5aaa81c24ce20a01cf9a3e14af1e"} Mar 20 17:22:08 crc kubenswrapper[4803]: I0320 17:22:08.836139 4803 generic.go:334] "Generic (PLEG): container finished" podID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerID="8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8" exitCode=0 Mar 20 17:22:08 crc kubenswrapper[4803]: I0320 17:22:08.836233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerDied","Data":"8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8"} Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.305002 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vsx7r"] Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.307027 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.310472 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.342812 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsx7r"] Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.348907 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66687ffe-614d-4427-a236-13b8623bbd4c-catalog-content\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.348972 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66687ffe-614d-4427-a236-13b8623bbd4c-utilities\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.349052 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcc4b\" (UniqueName: \"kubernetes.io/projected/66687ffe-614d-4427-a236-13b8623bbd4c-kube-api-access-jcc4b\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.450696 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66687ffe-614d-4427-a236-13b8623bbd4c-catalog-content\") pod \"redhat-marketplace-vsx7r\" (UID: 
\"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.450893 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66687ffe-614d-4427-a236-13b8623bbd4c-utilities\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.451179 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcc4b\" (UniqueName: \"kubernetes.io/projected/66687ffe-614d-4427-a236-13b8623bbd4c-kube-api-access-jcc4b\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.451940 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66687ffe-614d-4427-a236-13b8623bbd4c-catalog-content\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.452115 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66687ffe-614d-4427-a236-13b8623bbd4c-utilities\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.479779 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcc4b\" (UniqueName: \"kubernetes.io/projected/66687ffe-614d-4427-a236-13b8623bbd4c-kube-api-access-jcc4b\") pod \"redhat-marketplace-vsx7r\" (UID: \"66687ffe-614d-4427-a236-13b8623bbd4c\") " 
pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.481466 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xnbtp"] Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.483419 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.486714 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.489587 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xnbtp"] Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.552307 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c920780-df66-4654-a41c-b178a85a7885-utilities\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.552361 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c920780-df66-4654-a41c-b178a85a7885-catalog-content\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.552400 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtdl\" (UniqueName: \"kubernetes.io/projected/3c920780-df66-4654-a41c-b178a85a7885-kube-api-access-ldtdl\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" 
Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.646649 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.653115 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c920780-df66-4654-a41c-b178a85a7885-utilities\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.653166 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c920780-df66-4654-a41c-b178a85a7885-catalog-content\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.653248 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtdl\" (UniqueName: \"kubernetes.io/projected/3c920780-df66-4654-a41c-b178a85a7885-kube-api-access-ldtdl\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.654010 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c920780-df66-4654-a41c-b178a85a7885-utilities\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.654318 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c920780-df66-4654-a41c-b178a85a7885-catalog-content\") pod 
\"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.680168 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldtdl\" (UniqueName: \"kubernetes.io/projected/3c920780-df66-4654-a41c-b178a85a7885-kube-api-access-ldtdl\") pod \"redhat-operators-xnbtp\" (UID: \"3c920780-df66-4654-a41c-b178a85a7885\") " pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.826050 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.844646 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerStarted","Data":"afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964"} Mar 20 17:22:09 crc kubenswrapper[4803]: I0320 17:22:09.850633 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hbxm" event={"ID":"d1eeda39-65c5-416d-9ad7-1de7c20e49f7","Type":"ContainerStarted","Data":"eb5434a31ce7079dbc36cbbc9b2004e38a5675dbfc5fa61195f71f2100c86798"} Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.064508 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vsx7r"] Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.862588 4803 generic.go:334] "Generic (PLEG): container finished" podID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerID="eb5434a31ce7079dbc36cbbc9b2004e38a5675dbfc5fa61195f71f2100c86798" exitCode=0 Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.862747 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hbxm" 
event={"ID":"d1eeda39-65c5-416d-9ad7-1de7c20e49f7","Type":"ContainerDied","Data":"eb5434a31ce7079dbc36cbbc9b2004e38a5675dbfc5fa61195f71f2100c86798"} Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.868428 4803 generic.go:334] "Generic (PLEG): container finished" podID="66687ffe-614d-4427-a236-13b8623bbd4c" containerID="19508e771faa05f9f96de33f3d7dbc7f3c701f7509252448cc8cf3455d9a4888" exitCode=0 Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.868567 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsx7r" event={"ID":"66687ffe-614d-4427-a236-13b8623bbd4c","Type":"ContainerDied","Data":"19508e771faa05f9f96de33f3d7dbc7f3c701f7509252448cc8cf3455d9a4888"} Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.868606 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsx7r" event={"ID":"66687ffe-614d-4427-a236-13b8623bbd4c","Type":"ContainerStarted","Data":"51152d574517af2c5b03218b297019130a2650d3caa7844b264587e23955ba77"} Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.880818 4803 generic.go:334] "Generic (PLEG): container finished" podID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerID="afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964" exitCode=0 Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.880903 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerDied","Data":"afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964"} Mar 20 17:22:10 crc kubenswrapper[4803]: I0320 17:22:10.980878 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xnbtp"] Mar 20 17:22:11 crc kubenswrapper[4803]: I0320 17:22:11.891823 4803 generic.go:334] "Generic (PLEG): container finished" podID="3c920780-df66-4654-a41c-b178a85a7885" 
containerID="dec63b78db01d0df765c998763b7fbf08653683b3c41bc5ff5b4cb5065512e18" exitCode=0 Mar 20 17:22:11 crc kubenswrapper[4803]: I0320 17:22:11.892193 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnbtp" event={"ID":"3c920780-df66-4654-a41c-b178a85a7885","Type":"ContainerDied","Data":"dec63b78db01d0df765c998763b7fbf08653683b3c41bc5ff5b4cb5065512e18"} Mar 20 17:22:11 crc kubenswrapper[4803]: I0320 17:22:11.892231 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnbtp" event={"ID":"3c920780-df66-4654-a41c-b178a85a7885","Type":"ContainerStarted","Data":"a727c5c0039433f2b0e6b4b9d664b15e133d0476c08f856cbf5cc6bbf87594f9"} Mar 20 17:22:11 crc kubenswrapper[4803]: I0320 17:22:11.896456 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsx7r" event={"ID":"66687ffe-614d-4427-a236-13b8623bbd4c","Type":"ContainerStarted","Data":"ec288ef1de4d0a8c35ef9c5cb355fa96e392453f8ecb59481ae04d6836581121"} Mar 20 17:22:11 crc kubenswrapper[4803]: I0320 17:22:11.900490 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerStarted","Data":"fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350"} Mar 20 17:22:11 crc kubenswrapper[4803]: I0320 17:22:11.903136 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hbxm" event={"ID":"d1eeda39-65c5-416d-9ad7-1de7c20e49f7","Type":"ContainerStarted","Data":"3f7841838c08208c6fee3631458944ecb2c6989a1b04d346a9b6050191873234"} Mar 20 17:22:12 crc kubenswrapper[4803]: I0320 17:22:12.000758 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4m7j6" podStartSLOduration=3.42982593 podStartE2EDuration="6.000742801s" podCreationTimestamp="2026-03-20 17:22:06 
+0000 UTC" firstStartedPulling="2026-03-20 17:22:08.839205747 +0000 UTC m=+338.750797857" lastFinishedPulling="2026-03-20 17:22:11.410122618 +0000 UTC m=+341.321714728" observedRunningTime="2026-03-20 17:22:11.998328839 +0000 UTC m=+341.909920949" watchObservedRunningTime="2026-03-20 17:22:12.000742801 +0000 UTC m=+341.912334871" Mar 20 17:22:12 crc kubenswrapper[4803]: I0320 17:22:12.016889 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hbxm" podStartSLOduration=2.508900885 podStartE2EDuration="5.016873819s" podCreationTimestamp="2026-03-20 17:22:07 +0000 UTC" firstStartedPulling="2026-03-20 17:22:08.835655722 +0000 UTC m=+338.747247832" lastFinishedPulling="2026-03-20 17:22:11.343628666 +0000 UTC m=+341.255220766" observedRunningTime="2026-03-20 17:22:12.014239781 +0000 UTC m=+341.925831861" watchObservedRunningTime="2026-03-20 17:22:12.016873819 +0000 UTC m=+341.928465889" Mar 20 17:22:12 crc kubenswrapper[4803]: I0320 17:22:12.911849 4803 generic.go:334] "Generic (PLEG): container finished" podID="66687ffe-614d-4427-a236-13b8623bbd4c" containerID="ec288ef1de4d0a8c35ef9c5cb355fa96e392453f8ecb59481ae04d6836581121" exitCode=0 Mar 20 17:22:12 crc kubenswrapper[4803]: I0320 17:22:12.911963 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsx7r" event={"ID":"66687ffe-614d-4427-a236-13b8623bbd4c","Type":"ContainerDied","Data":"ec288ef1de4d0a8c35ef9c5cb355fa96e392453f8ecb59481ae04d6836581121"} Mar 20 17:22:13 crc kubenswrapper[4803]: I0320 17:22:13.921179 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vsx7r" event={"ID":"66687ffe-614d-4427-a236-13b8623bbd4c","Type":"ContainerStarted","Data":"92e2b51cc62546aefcab66a454daa8739ba9273c229dfaaba38e184500cff9f6"} Mar 20 17:22:13 crc kubenswrapper[4803]: I0320 17:22:13.924051 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="3c920780-df66-4654-a41c-b178a85a7885" containerID="9f500e32499cc9650266abbafc745198cf7a809e7351f182ce7f206faff5bce2" exitCode=0 Mar 20 17:22:13 crc kubenswrapper[4803]: I0320 17:22:13.924092 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnbtp" event={"ID":"3c920780-df66-4654-a41c-b178a85a7885","Type":"ContainerDied","Data":"9f500e32499cc9650266abbafc745198cf7a809e7351f182ce7f206faff5bce2"} Mar 20 17:22:13 crc kubenswrapper[4803]: I0320 17:22:13.952975 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vsx7r" podStartSLOduration=2.508042011 podStartE2EDuration="4.952952056s" podCreationTimestamp="2026-03-20 17:22:09 +0000 UTC" firstStartedPulling="2026-03-20 17:22:10.872089984 +0000 UTC m=+340.783682084" lastFinishedPulling="2026-03-20 17:22:13.317000049 +0000 UTC m=+343.228592129" observedRunningTime="2026-03-20 17:22:13.9517287 +0000 UTC m=+343.863320790" watchObservedRunningTime="2026-03-20 17:22:13.952952056 +0000 UTC m=+343.864544146" Mar 20 17:22:14 crc kubenswrapper[4803]: I0320 17:22:14.934681 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xnbtp" event={"ID":"3c920780-df66-4654-a41c-b178a85a7885","Type":"ContainerStarted","Data":"99e728be146da16a8ec910bb9a0635ee75f3030a680f0f6163a6b6c83e3e0647"} Mar 20 17:22:14 crc kubenswrapper[4803]: I0320 17:22:14.962584 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xnbtp" podStartSLOduration=3.192687363 podStartE2EDuration="5.962563663s" podCreationTimestamp="2026-03-20 17:22:09 +0000 UTC" firstStartedPulling="2026-03-20 17:22:11.894376217 +0000 UTC m=+341.805968287" lastFinishedPulling="2026-03-20 17:22:14.664252497 +0000 UTC m=+344.575844587" observedRunningTime="2026-03-20 17:22:14.957886634 +0000 UTC m=+344.869478714" watchObservedRunningTime="2026-03-20 
17:22:14.962563663 +0000 UTC m=+344.874155743" Mar 20 17:22:17 crc kubenswrapper[4803]: I0320 17:22:17.212581 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:17 crc kubenswrapper[4803]: I0320 17:22:17.213101 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:17 crc kubenswrapper[4803]: I0320 17:22:17.281881 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:17 crc kubenswrapper[4803]: I0320 17:22:17.402634 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:17 crc kubenswrapper[4803]: I0320 17:22:17.402690 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:17 crc kubenswrapper[4803]: I0320 17:22:17.445948 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:18 crc kubenswrapper[4803]: I0320 17:22:18.035699 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:22:18 crc kubenswrapper[4803]: I0320 17:22:18.037125 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:22:19 crc kubenswrapper[4803]: I0320 17:22:19.647476 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:19 crc kubenswrapper[4803]: I0320 17:22:19.648579 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:19 crc kubenswrapper[4803]: I0320 
17:22:19.682161 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:19 crc kubenswrapper[4803]: I0320 17:22:19.827136 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:19 crc kubenswrapper[4803]: I0320 17:22:19.827212 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:19 crc kubenswrapper[4803]: I0320 17:22:19.972095 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-cw85j" Mar 20 17:22:20 crc kubenswrapper[4803]: I0320 17:22:20.053184 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwfxw"] Mar 20 17:22:20 crc kubenswrapper[4803]: I0320 17:22:20.064027 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vsx7r" Mar 20 17:22:20 crc kubenswrapper[4803]: I0320 17:22:20.894771 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xnbtp" podUID="3c920780-df66-4654-a41c-b178a85a7885" containerName="registry-server" probeResult="failure" output=< Mar 20 17:22:20 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 17:22:20 crc kubenswrapper[4803]: > Mar 20 17:22:29 crc kubenswrapper[4803]: I0320 17:22:29.884846 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:29 crc kubenswrapper[4803]: I0320 17:22:29.944522 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xnbtp" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.108062 4803 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" podUID="4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" containerName="registry" containerID="cri-o://18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0" gracePeriod=30 Mar 20 17:22:45 crc kubenswrapper[4803]: E0320 17:22:45.174702 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbc5db9_573c_4314_9ebc_7b3e9e45f5bd.slice/crio-18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.608976 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.682948 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-trusted-ca\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.683005 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2jn6\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-kube-api-access-z2jn6\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.683217 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc 
kubenswrapper[4803]: I0320 17:22:45.683875 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-ca-trust-extracted\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.683962 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-installation-pull-secrets\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.684036 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-tls\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.684147 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-bound-sa-token\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.684280 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-certificates\") pod \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\" (UID: \"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd\") " Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.684379 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.688413 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.689299 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.694628 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.699359 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.701651 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-kube-api-access-z2jn6" (OuterVolumeSpecName: "kube-api-access-z2jn6") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "kube-api-access-z2jn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.702212 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.705691 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.713704 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" (UID: "4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.789447 4803 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.789774 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2jn6\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-kube-api-access-z2jn6\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.789787 4803 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.789799 4803 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.789812 4803 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:45 crc kubenswrapper[4803]: I0320 17:22:45.789831 4803 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.134131 4803 generic.go:334] "Generic (PLEG): container finished" podID="4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" containerID="18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0" exitCode=0 Mar 20 17:22:46 crc 
kubenswrapper[4803]: I0320 17:22:46.134247 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.134260 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" event={"ID":"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd","Type":"ContainerDied","Data":"18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0"} Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.135284 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kwfxw" event={"ID":"4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd","Type":"ContainerDied","Data":"98f62eaf81b9d7d5110dafc6195e0d2fc28b5e0505d2157ae5839ad6452fc5c7"} Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.135313 4803 scope.go:117] "RemoveContainer" containerID="18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0" Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.158806 4803 scope.go:117] "RemoveContainer" containerID="18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0" Mar 20 17:22:46 crc kubenswrapper[4803]: E0320 17:22:46.159351 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0\": container with ID starting with 18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0 not found: ID does not exist" containerID="18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0" Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.159388 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0"} err="failed to get container status 
\"18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0\": rpc error: code = NotFound desc = could not find container \"18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0\": container with ID starting with 18d8ebf49c1255436794b67cb8b1293cb67634f34fd94e05b7e34d26f0f279d0 not found: ID does not exist" Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.174371 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwfxw"] Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.178198 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kwfxw"] Mar 20 17:22:46 crc kubenswrapper[4803]: I0320 17:22:46.860513 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" path="/var/lib/kubelet/pods/4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd/volumes" Mar 20 17:23:08 crc kubenswrapper[4803]: I0320 17:23:08.246473 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:23:08 crc kubenswrapper[4803]: I0320 17:23:08.246935 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:23:38 crc kubenswrapper[4803]: I0320 17:23:38.247499 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 20 17:23:38 crc kubenswrapper[4803]: I0320 17:23:38.248232 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.135549 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567124-tmtll"] Mar 20 17:24:00 crc kubenswrapper[4803]: E0320 17:24:00.136460 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" containerName="registry" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.136483 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" containerName="registry" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.136723 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbc5db9-573c-4314-9ebc-7b3e9e45f5bd" containerName="registry" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.137262 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-tmtll" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.140804 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.141021 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.141536 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.143169 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-tmtll"] Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.289932 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrwtc\" (UniqueName: \"kubernetes.io/projected/28cdfb98-cfe7-42a7-8ee0-84d618800c5b-kube-api-access-rrwtc\") pod \"auto-csr-approver-29567124-tmtll\" (UID: \"28cdfb98-cfe7-42a7-8ee0-84d618800c5b\") " pod="openshift-infra/auto-csr-approver-29567124-tmtll" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.391178 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrwtc\" (UniqueName: \"kubernetes.io/projected/28cdfb98-cfe7-42a7-8ee0-84d618800c5b-kube-api-access-rrwtc\") pod \"auto-csr-approver-29567124-tmtll\" (UID: \"28cdfb98-cfe7-42a7-8ee0-84d618800c5b\") " pod="openshift-infra/auto-csr-approver-29567124-tmtll" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.431855 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrwtc\" (UniqueName: \"kubernetes.io/projected/28cdfb98-cfe7-42a7-8ee0-84d618800c5b-kube-api-access-rrwtc\") pod \"auto-csr-approver-29567124-tmtll\" (UID: \"28cdfb98-cfe7-42a7-8ee0-84d618800c5b\") " 
pod="openshift-infra/auto-csr-approver-29567124-tmtll" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.455115 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-tmtll" Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.686028 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-tmtll"] Mar 20 17:24:00 crc kubenswrapper[4803]: I0320 17:24:00.705362 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:24:01 crc kubenswrapper[4803]: I0320 17:24:01.665202 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-tmtll" event={"ID":"28cdfb98-cfe7-42a7-8ee0-84d618800c5b","Type":"ContainerStarted","Data":"9e2a5ba493b86b6460a1f7d0f09e0cbb983258a1d16db46139a90b4d29324d53"} Mar 20 17:24:02 crc kubenswrapper[4803]: I0320 17:24:02.674256 4803 generic.go:334] "Generic (PLEG): container finished" podID="28cdfb98-cfe7-42a7-8ee0-84d618800c5b" containerID="cc9b198f8d5a17014c5ef6d250ebdc17c859a9cdb5adb8c043470954f0e136c5" exitCode=0 Mar 20 17:24:02 crc kubenswrapper[4803]: I0320 17:24:02.674309 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-tmtll" event={"ID":"28cdfb98-cfe7-42a7-8ee0-84d618800c5b","Type":"ContainerDied","Data":"cc9b198f8d5a17014c5ef6d250ebdc17c859a9cdb5adb8c043470954f0e136c5"} Mar 20 17:24:03 crc kubenswrapper[4803]: I0320 17:24:03.960299 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-tmtll" Mar 20 17:24:04 crc kubenswrapper[4803]: I0320 17:24:04.041183 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrwtc\" (UniqueName: \"kubernetes.io/projected/28cdfb98-cfe7-42a7-8ee0-84d618800c5b-kube-api-access-rrwtc\") pod \"28cdfb98-cfe7-42a7-8ee0-84d618800c5b\" (UID: \"28cdfb98-cfe7-42a7-8ee0-84d618800c5b\") " Mar 20 17:24:04 crc kubenswrapper[4803]: I0320 17:24:04.050200 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28cdfb98-cfe7-42a7-8ee0-84d618800c5b-kube-api-access-rrwtc" (OuterVolumeSpecName: "kube-api-access-rrwtc") pod "28cdfb98-cfe7-42a7-8ee0-84d618800c5b" (UID: "28cdfb98-cfe7-42a7-8ee0-84d618800c5b"). InnerVolumeSpecName "kube-api-access-rrwtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:24:04 crc kubenswrapper[4803]: I0320 17:24:04.143063 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrwtc\" (UniqueName: \"kubernetes.io/projected/28cdfb98-cfe7-42a7-8ee0-84d618800c5b-kube-api-access-rrwtc\") on node \"crc\" DevicePath \"\"" Mar 20 17:24:04 crc kubenswrapper[4803]: I0320 17:24:04.689205 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567124-tmtll" event={"ID":"28cdfb98-cfe7-42a7-8ee0-84d618800c5b","Type":"ContainerDied","Data":"9e2a5ba493b86b6460a1f7d0f09e0cbb983258a1d16db46139a90b4d29324d53"} Mar 20 17:24:04 crc kubenswrapper[4803]: I0320 17:24:04.689259 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e2a5ba493b86b6460a1f7d0f09e0cbb983258a1d16db46139a90b4d29324d53" Mar 20 17:24:04 crc kubenswrapper[4803]: I0320 17:24:04.689583 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567124-tmtll" Mar 20 17:24:05 crc kubenswrapper[4803]: I0320 17:24:05.054966 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567118-clg6s"] Mar 20 17:24:05 crc kubenswrapper[4803]: I0320 17:24:05.058467 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567118-clg6s"] Mar 20 17:24:06 crc kubenswrapper[4803]: I0320 17:24:06.861557 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96658fb9-4742-457e-b7ec-384ef06ec6a8" path="/var/lib/kubelet/pods/96658fb9-4742-457e-b7ec-384ef06ec6a8/volumes" Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.246649 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.247026 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.247101 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.249104 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45b15f7f930be75ad9338760e525031847e940aad8524904031331b60994207d"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.249205 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://45b15f7f930be75ad9338760e525031847e940aad8524904031331b60994207d" gracePeriod=600 Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.718403 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="45b15f7f930be75ad9338760e525031847e940aad8524904031331b60994207d" exitCode=0 Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.718548 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"45b15f7f930be75ad9338760e525031847e940aad8524904031331b60994207d"} Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.718812 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"dab0fb0f3d947efe9367eb4b30fc777ab2c8a308cb11da87dd8cf745d61564c4"} Mar 20 17:24:08 crc kubenswrapper[4803]: I0320 17:24:08.718839 4803 scope.go:117] "RemoveContainer" containerID="65940b38653f640196377cdf6c52dabac29bf3708a6f546d7affca969f10c203" Mar 20 17:25:10 crc kubenswrapper[4803]: I0320 17:25:10.704478 4803 scope.go:117] "RemoveContainer" containerID="6feff3542e2c7ae784dcdafe4db6b281694f53922faa1e45fcedf5b961c198b9" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.149046 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567126-v4lng"] Mar 20 17:26:00 crc kubenswrapper[4803]: E0320 
17:26:00.150106 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28cdfb98-cfe7-42a7-8ee0-84d618800c5b" containerName="oc" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.150129 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="28cdfb98-cfe7-42a7-8ee0-84d618800c5b" containerName="oc" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.150308 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="28cdfb98-cfe7-42a7-8ee0-84d618800c5b" containerName="oc" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.150902 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-v4lng" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.154442 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.155322 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.155453 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.164383 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-v4lng"] Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.191803 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjww\" (UniqueName: \"kubernetes.io/projected/5aa173fa-44b8-4141-8238-32c03c99cbbc-kube-api-access-gbjww\") pod \"auto-csr-approver-29567126-v4lng\" (UID: \"5aa173fa-44b8-4141-8238-32c03c99cbbc\") " pod="openshift-infra/auto-csr-approver-29567126-v4lng" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.292857 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gbjww\" (UniqueName: \"kubernetes.io/projected/5aa173fa-44b8-4141-8238-32c03c99cbbc-kube-api-access-gbjww\") pod \"auto-csr-approver-29567126-v4lng\" (UID: \"5aa173fa-44b8-4141-8238-32c03c99cbbc\") " pod="openshift-infra/auto-csr-approver-29567126-v4lng" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.317768 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjww\" (UniqueName: \"kubernetes.io/projected/5aa173fa-44b8-4141-8238-32c03c99cbbc-kube-api-access-gbjww\") pod \"auto-csr-approver-29567126-v4lng\" (UID: \"5aa173fa-44b8-4141-8238-32c03c99cbbc\") " pod="openshift-infra/auto-csr-approver-29567126-v4lng" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.477695 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-v4lng" Mar 20 17:26:00 crc kubenswrapper[4803]: I0320 17:26:00.713755 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-v4lng"] Mar 20 17:26:01 crc kubenswrapper[4803]: I0320 17:26:01.408323 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-v4lng" event={"ID":"5aa173fa-44b8-4141-8238-32c03c99cbbc","Type":"ContainerStarted","Data":"d4fc9573a4b2904b3a3c7aa6a681f27e1df235c92cd2f3d2bda836bb1d191a11"} Mar 20 17:26:02 crc kubenswrapper[4803]: I0320 17:26:02.419688 4803 generic.go:334] "Generic (PLEG): container finished" podID="5aa173fa-44b8-4141-8238-32c03c99cbbc" containerID="ba3a027d817d4d5f8a7593861130eb6761ed9fc3fbada53e331d2f88668691c5" exitCode=0 Mar 20 17:26:02 crc kubenswrapper[4803]: I0320 17:26:02.419918 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-v4lng" event={"ID":"5aa173fa-44b8-4141-8238-32c03c99cbbc","Type":"ContainerDied","Data":"ba3a027d817d4d5f8a7593861130eb6761ed9fc3fbada53e331d2f88668691c5"} Mar 20 17:26:03 crc kubenswrapper[4803]: I0320 
17:26:03.745043 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-v4lng" Mar 20 17:26:03 crc kubenswrapper[4803]: I0320 17:26:03.844930 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjww\" (UniqueName: \"kubernetes.io/projected/5aa173fa-44b8-4141-8238-32c03c99cbbc-kube-api-access-gbjww\") pod \"5aa173fa-44b8-4141-8238-32c03c99cbbc\" (UID: \"5aa173fa-44b8-4141-8238-32c03c99cbbc\") " Mar 20 17:26:03 crc kubenswrapper[4803]: I0320 17:26:03.853209 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aa173fa-44b8-4141-8238-32c03c99cbbc-kube-api-access-gbjww" (OuterVolumeSpecName: "kube-api-access-gbjww") pod "5aa173fa-44b8-4141-8238-32c03c99cbbc" (UID: "5aa173fa-44b8-4141-8238-32c03c99cbbc"). InnerVolumeSpecName "kube-api-access-gbjww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:26:03 crc kubenswrapper[4803]: I0320 17:26:03.946909 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjww\" (UniqueName: \"kubernetes.io/projected/5aa173fa-44b8-4141-8238-32c03c99cbbc-kube-api-access-gbjww\") on node \"crc\" DevicePath \"\"" Mar 20 17:26:04 crc kubenswrapper[4803]: I0320 17:26:04.438813 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567126-v4lng" event={"ID":"5aa173fa-44b8-4141-8238-32c03c99cbbc","Type":"ContainerDied","Data":"d4fc9573a4b2904b3a3c7aa6a681f27e1df235c92cd2f3d2bda836bb1d191a11"} Mar 20 17:26:04 crc kubenswrapper[4803]: I0320 17:26:04.438869 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4fc9573a4b2904b3a3c7aa6a681f27e1df235c92cd2f3d2bda836bb1d191a11" Mar 20 17:26:04 crc kubenswrapper[4803]: I0320 17:26:04.438943 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567126-v4lng" Mar 20 17:26:04 crc kubenswrapper[4803]: I0320 17:26:04.814915 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-kk4jn"] Mar 20 17:26:04 crc kubenswrapper[4803]: I0320 17:26:04.824217 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567120-kk4jn"] Mar 20 17:26:04 crc kubenswrapper[4803]: I0320 17:26:04.859492 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab12c524-2ba2-4454-8079-af3be98f4ccf" path="/var/lib/kubelet/pods/ab12c524-2ba2-4454-8079-af3be98f4ccf/volumes" Mar 20 17:26:08 crc kubenswrapper[4803]: I0320 17:26:08.246315 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:26:08 crc kubenswrapper[4803]: I0320 17:26:08.246610 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:26:10 crc kubenswrapper[4803]: I0320 17:26:10.744860 4803 scope.go:117] "RemoveContainer" containerID="9f8c50bdb93287488b1047c1770812d09f5cf3ff0e0a78b078991900d9029491" Mar 20 17:26:10 crc kubenswrapper[4803]: I0320 17:26:10.770964 4803 scope.go:117] "RemoveContainer" containerID="f989f6268faff4205221804cd6257a28d0e974cb2b4a57ee487950413532d5d3" Mar 20 17:26:10 crc kubenswrapper[4803]: I0320 17:26:10.806516 4803 scope.go:117] "RemoveContainer" containerID="e3af574ba8a9730f62410c91e8662dd86c731535ce0fdd64402ef40a0e98f056" Mar 20 17:26:38 crc 
kubenswrapper[4803]: I0320 17:26:38.246288 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:26:38 crc kubenswrapper[4803]: I0320 17:26:38.246893 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.246146 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.246777 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.246843 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.247638 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dab0fb0f3d947efe9367eb4b30fc777ab2c8a308cb11da87dd8cf745d61564c4"} 
pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.247734 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://dab0fb0f3d947efe9367eb4b30fc777ab2c8a308cb11da87dd8cf745d61564c4" gracePeriod=600 Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.895892 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="dab0fb0f3d947efe9367eb4b30fc777ab2c8a308cb11da87dd8cf745d61564c4" exitCode=0 Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.895951 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"dab0fb0f3d947efe9367eb4b30fc777ab2c8a308cb11da87dd8cf745d61564c4"} Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.896338 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"b24a42c256c66e693667314faf1192fa4d816bc92003b13635ba6ca267c3a888"} Mar 20 17:27:08 crc kubenswrapper[4803]: I0320 17:27:08.896362 4803 scope.go:117] "RemoveContainer" containerID="45b15f7f930be75ad9338760e525031847e940aad8524904031331b60994207d" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.486262 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8"] Mar 20 17:27:38 crc kubenswrapper[4803]: E0320 17:27:38.486865 4803 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5aa173fa-44b8-4141-8238-32c03c99cbbc" containerName="oc" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.486877 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aa173fa-44b8-4141-8238-32c03c99cbbc" containerName="oc" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.486974 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aa173fa-44b8-4141-8238-32c03c99cbbc" containerName="oc" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.487307 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.489922 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.490078 4803 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qgz4h" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.490319 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.491512 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-dt62l"] Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.492319 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dt62l" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.494790 4803 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-k278c" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.509613 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dt62l"] Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.522507 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8"] Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.526767 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gq2dl"] Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.527660 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.529637 4803 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-fb9xv" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.539863 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gq2dl"] Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.635879 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f76dn\" (UniqueName: \"kubernetes.io/projected/ea5d9398-4412-4a8c-a015-42ec0733de0c-kube-api-access-f76dn\") pod \"cert-manager-webhook-687f57d79b-gq2dl\" (UID: \"ea5d9398-4412-4a8c-a015-42ec0733de0c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.635951 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4cjd\" (UniqueName: 
\"kubernetes.io/projected/06da84ce-5bd4-4e75-ac2d-eda831724e58-kube-api-access-z4cjd\") pod \"cert-manager-858654f9db-dt62l\" (UID: \"06da84ce-5bd4-4e75-ac2d-eda831724e58\") " pod="cert-manager/cert-manager-858654f9db-dt62l" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.636132 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8bsc\" (UniqueName: \"kubernetes.io/projected/4d10f699-d096-4cc0-bd7d-1afc806ede10-kube-api-access-f8bsc\") pod \"cert-manager-cainjector-cf98fcc89-wfvp8\" (UID: \"4d10f699-d096-4cc0-bd7d-1afc806ede10\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.738063 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8bsc\" (UniqueName: \"kubernetes.io/projected/4d10f699-d096-4cc0-bd7d-1afc806ede10-kube-api-access-f8bsc\") pod \"cert-manager-cainjector-cf98fcc89-wfvp8\" (UID: \"4d10f699-d096-4cc0-bd7d-1afc806ede10\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.738585 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f76dn\" (UniqueName: \"kubernetes.io/projected/ea5d9398-4412-4a8c-a015-42ec0733de0c-kube-api-access-f76dn\") pod \"cert-manager-webhook-687f57d79b-gq2dl\" (UID: \"ea5d9398-4412-4a8c-a015-42ec0733de0c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.738698 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4cjd\" (UniqueName: \"kubernetes.io/projected/06da84ce-5bd4-4e75-ac2d-eda831724e58-kube-api-access-z4cjd\") pod \"cert-manager-858654f9db-dt62l\" (UID: \"06da84ce-5bd4-4e75-ac2d-eda831724e58\") " pod="cert-manager/cert-manager-858654f9db-dt62l" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.758234 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4cjd\" (UniqueName: \"kubernetes.io/projected/06da84ce-5bd4-4e75-ac2d-eda831724e58-kube-api-access-z4cjd\") pod \"cert-manager-858654f9db-dt62l\" (UID: \"06da84ce-5bd4-4e75-ac2d-eda831724e58\") " pod="cert-manager/cert-manager-858654f9db-dt62l" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.758976 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8bsc\" (UniqueName: \"kubernetes.io/projected/4d10f699-d096-4cc0-bd7d-1afc806ede10-kube-api-access-f8bsc\") pod \"cert-manager-cainjector-cf98fcc89-wfvp8\" (UID: \"4d10f699-d096-4cc0-bd7d-1afc806ede10\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.764243 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f76dn\" (UniqueName: \"kubernetes.io/projected/ea5d9398-4412-4a8c-a015-42ec0733de0c-kube-api-access-f76dn\") pod \"cert-manager-webhook-687f57d79b-gq2dl\" (UID: \"ea5d9398-4412-4a8c-a015-42ec0733de0c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.802314 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.808068 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-dt62l" Mar 20 17:27:38 crc kubenswrapper[4803]: I0320 17:27:38.840653 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" Mar 20 17:27:39 crc kubenswrapper[4803]: I0320 17:27:39.161398 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gq2dl"] Mar 20 17:27:39 crc kubenswrapper[4803]: I0320 17:27:39.335797 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-dt62l"] Mar 20 17:27:39 crc kubenswrapper[4803]: I0320 17:27:39.341896 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8"] Mar 20 17:27:39 crc kubenswrapper[4803]: W0320 17:27:39.351196 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d10f699_d096_4cc0_bd7d_1afc806ede10.slice/crio-839a165b6384fd2b294d96c262bccd7eab3eafca9be64a9979b9d5fe64176aa5 WatchSource:0}: Error finding container 839a165b6384fd2b294d96c262bccd7eab3eafca9be64a9979b9d5fe64176aa5: Status 404 returned error can't find the container with id 839a165b6384fd2b294d96c262bccd7eab3eafca9be64a9979b9d5fe64176aa5 Mar 20 17:27:40 crc kubenswrapper[4803]: I0320 17:27:40.112685 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" event={"ID":"4d10f699-d096-4cc0-bd7d-1afc806ede10","Type":"ContainerStarted","Data":"839a165b6384fd2b294d96c262bccd7eab3eafca9be64a9979b9d5fe64176aa5"} Mar 20 17:27:40 crc kubenswrapper[4803]: I0320 17:27:40.117762 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dt62l" event={"ID":"06da84ce-5bd4-4e75-ac2d-eda831724e58","Type":"ContainerStarted","Data":"d90b1ff17650ecaba39bbed1a975c35db35376dec6730fa781c77c30ec7e4098"} Mar 20 17:27:40 crc kubenswrapper[4803]: I0320 17:27:40.119443 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" 
event={"ID":"ea5d9398-4412-4a8c-a015-42ec0733de0c","Type":"ContainerStarted","Data":"f6a9a84b080a9accf204fc138363e12876f6a53fa9fdccbf59a87b6a17cc33d3"} Mar 20 17:27:43 crc kubenswrapper[4803]: I0320 17:27:43.139853 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" event={"ID":"ea5d9398-4412-4a8c-a015-42ec0733de0c","Type":"ContainerStarted","Data":"ed9cfbf38f6693b9149af6c7c29adf07b163aab9d4ccb593a4656ba938126d88"} Mar 20 17:27:43 crc kubenswrapper[4803]: I0320 17:27:43.140299 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" Mar 20 17:27:44 crc kubenswrapper[4803]: I0320 17:27:44.150102 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-dt62l" event={"ID":"06da84ce-5bd4-4e75-ac2d-eda831724e58","Type":"ContainerStarted","Data":"502e895def36e534d5fbdf5616496784edc295a85cb8239bd50c3f9025c3faaf"} Mar 20 17:27:44 crc kubenswrapper[4803]: I0320 17:27:44.153963 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" event={"ID":"4d10f699-d096-4cc0-bd7d-1afc806ede10","Type":"ContainerStarted","Data":"5d1845c84c0794c4a2a94b708e5b435a879a12cf100156bdd2be75c99125fc09"} Mar 20 17:27:44 crc kubenswrapper[4803]: I0320 17:27:44.174869 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-dt62l" podStartSLOduration=2.482253674 podStartE2EDuration="6.174847905s" podCreationTimestamp="2026-03-20 17:27:38 +0000 UTC" firstStartedPulling="2026-03-20 17:27:39.342491601 +0000 UTC m=+669.254083701" lastFinishedPulling="2026-03-20 17:27:43.035085862 +0000 UTC m=+672.946677932" observedRunningTime="2026-03-20 17:27:44.169963386 +0000 UTC m=+674.081555486" watchObservedRunningTime="2026-03-20 17:27:44.174847905 +0000 UTC m=+674.086439985" Mar 20 17:27:44 crc kubenswrapper[4803]: I0320 
17:27:44.175652 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" podStartSLOduration=3.524353123 podStartE2EDuration="6.175643506s" podCreationTimestamp="2026-03-20 17:27:38 +0000 UTC" firstStartedPulling="2026-03-20 17:27:39.166956307 +0000 UTC m=+669.078548377" lastFinishedPulling="2026-03-20 17:27:41.81824669 +0000 UTC m=+671.729838760" observedRunningTime="2026-03-20 17:27:43.172901718 +0000 UTC m=+673.084493828" watchObservedRunningTime="2026-03-20 17:27:44.175643506 +0000 UTC m=+674.087235586" Mar 20 17:27:44 crc kubenswrapper[4803]: I0320 17:27:44.194619 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-wfvp8" podStartSLOduration=2.572508333 podStartE2EDuration="6.194594358s" podCreationTimestamp="2026-03-20 17:27:38 +0000 UTC" firstStartedPulling="2026-03-20 17:27:39.355184357 +0000 UTC m=+669.266776457" lastFinishedPulling="2026-03-20 17:27:42.977270402 +0000 UTC m=+672.888862482" observedRunningTime="2026-03-20 17:27:44.190420517 +0000 UTC m=+674.102012597" watchObservedRunningTime="2026-03-20 17:27:44.194594358 +0000 UTC m=+674.106186468" Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.627456 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4v5dx"] Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.628760 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-controller" containerID="cri-o://255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc" gracePeriod=30 Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.628888 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" 
containerName="nbdb" containerID="cri-o://f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c" gracePeriod=30 Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.628993 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="sbdb" containerID="cri-o://ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" gracePeriod=30 Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.629017 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a" gracePeriod=30 Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.629111 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-node" containerID="cri-o://07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129" gracePeriod=30 Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.629151 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-acl-logging" containerID="cri-o://1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e" gracePeriod=30 Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.629089 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="northd" containerID="cri-o://0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869" gracePeriod=30 Mar 20 17:27:48 crc 
kubenswrapper[4803]: I0320 17:27:48.691976 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovnkube-controller" containerID="cri-o://93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" gracePeriod=30 Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.844087 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-gq2dl" Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.992294 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4v5dx_4326b171-36ab-465f-ba67-a636b36f1f89/ovn-acl-logging/0.log" Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.993073 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4v5dx_4326b171-36ab-465f-ba67-a636b36f1f89/ovn-controller/0.log" Mar 20 17:27:48 crc kubenswrapper[4803]: I0320 17:27:48.993792 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085062 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7vhj\" (UniqueName: \"kubernetes.io/projected/4326b171-36ab-465f-ba67-a636b36f1f89-kube-api-access-s7vhj\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085350 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-ovn\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085495 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-log-socket\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085677 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-netns\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085830 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-openvswitch\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085967 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-netd\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086119 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-config\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086260 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-env-overrides\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086396 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-slash\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086570 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-bin\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086714 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-etc-openvswitch\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc 
kubenswrapper[4803]: I0320 17:27:49.086853 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-node-log\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086976 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-kubelet\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087085 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-var-lib-openvswitch\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087224 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4326b171-36ab-465f-ba67-a636b36f1f89-ovn-node-metrics-cert\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087338 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-systemd-units\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087477 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-script-lib\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087623 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-ovn-kubernetes\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087737 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-systemd\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087851 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-var-lib-cni-networks-ovn-kubernetes\") pod \"4326b171-36ab-465f-ba67-a636b36f1f89\" (UID: \"4326b171-36ab-465f-ba67-a636b36f1f89\") " Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085503 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085635 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-log-socket" (OuterVolumeSpecName: "log-socket") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085776 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.085898 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086055 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086466 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-slash" (OuterVolumeSpecName: "host-slash") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086645 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086654 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086706 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086740 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.086909 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-node-log" (OuterVolumeSpecName: "node-log") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087036 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087158 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087419 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087882 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.087927 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089099 4803 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089221 4803 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089325 4803 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089429 4803 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089549 4803 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089627 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089651 4803 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089835 4803 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.089929 4803 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090025 4803 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090125 4803 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090223 4803 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090315 4803 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090410 4803 reconciler_common.go:293] 
"Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090507 4803 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090636 4803 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4326b171-36ab-465f-ba67-a636b36f1f89-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.090752 4803 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.100999 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4326b171-36ab-465f-ba67-a636b36f1f89-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104393 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mnhnz"] Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104739 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovnkube-controller" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104766 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovnkube-controller" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104783 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="nbdb" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104794 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="nbdb" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104812 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-node" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104824 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-node" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104841 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="northd" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104851 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="northd" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104866 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kubecfg-setup" Mar 20 
17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104877 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kubecfg-setup" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104895 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-acl-logging" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104906 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-acl-logging" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104921 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-controller" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104932 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-controller" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104943 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="sbdb" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104954 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="sbdb" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.104970 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.104981 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.105174 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="sbdb" Mar 20 17:27:49 crc 
kubenswrapper[4803]: I0320 17:27:49.105198 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="nbdb" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.105214 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-controller" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.105225 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.105238 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovnkube-controller" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.105254 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="kube-rbac-proxy-node" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.105272 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="northd" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.105288 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" containerName="ovn-acl-logging" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.106925 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4326b171-36ab-465f-ba67-a636b36f1f89-kube-api-access-s7vhj" (OuterVolumeSpecName: "kube-api-access-s7vhj") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "kube-api-access-s7vhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.108199 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.114990 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "4326b171-36ab-465f-ba67-a636b36f1f89" (UID: "4326b171-36ab-465f-ba67-a636b36f1f89"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191346 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-etc-openvswitch\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191395 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-node-log\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191419 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovnkube-script-lib\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191442 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-systemd-units\") pod 
\"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191457 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-kubelet\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191472 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191489 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191548 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-env-overrides\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191600 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-systemd\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191629 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-cni-bin\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191696 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-run-netns\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191724 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-var-lib-openvswitch\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191742 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-ovn\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191807 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovn-node-metrics-cert\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191833 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-slash\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191851 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djszl\" (UniqueName: \"kubernetes.io/projected/c5ae3307-3764-4492-96ec-50d7d0944cfd-kube-api-access-djszl\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191886 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-cni-netd\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191908 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-log-socket\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191929 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-openvswitch\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.191951 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovnkube-config\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.192003 4803 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4326b171-36ab-465f-ba67-a636b36f1f89-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.192017 4803 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.192028 4803 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4326b171-36ab-465f-ba67-a636b36f1f89-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.192039 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7vhj\" (UniqueName: \"kubernetes.io/projected/4326b171-36ab-465f-ba67-a636b36f1f89-kube-api-access-s7vhj\") on node \"crc\" DevicePath \"\"" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.193701 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4v5dx_4326b171-36ab-465f-ba67-a636b36f1f89/ovn-acl-logging/0.log" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194164 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4v5dx_4326b171-36ab-465f-ba67-a636b36f1f89/ovn-controller/0.log" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194603 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" exitCode=0 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194625 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" exitCode=0 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194633 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c" exitCode=0 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194641 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869" exitCode=0 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194647 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a" exitCode=0 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194655 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129" exitCode=0 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194664 4803 generic.go:334] "Generic (PLEG): 
container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e" exitCode=143 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194674 4803 generic.go:334] "Generic (PLEG): container finished" podID="4326b171-36ab-465f-ba67-a636b36f1f89" containerID="255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc" exitCode=143 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194682 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194695 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194727 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194744 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194758 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194771 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194775 4803 scope.go:117] "RemoveContainer" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194783 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194890 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194900 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194906 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194916 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194927 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194934 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194940 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194947 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194952 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194958 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194963 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194969 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194974 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194981 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194989 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194994 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.194999 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195004 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195009 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195014 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195019 4803 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195024 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195029 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195036 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4v5dx" event={"ID":"4326b171-36ab-465f-ba67-a636b36f1f89","Type":"ContainerDied","Data":"7bf0ad411aece19fe6b688f92a0d4012aa4ba9929b264e80a04aca998b52d8f9"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195044 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195050 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195055 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195060 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} Mar 20 
17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195065 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195070 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195075 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195081 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.195086 4803 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.196192 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d8jn6_55c909c3-a57a-4440-9052-48718b1d2dfd/kube-multus/0.log" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.196231 4803 generic.go:334] "Generic (PLEG): container finished" podID="55c909c3-a57a-4440-9052-48718b1d2dfd" containerID="c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff" exitCode=2 Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.196251 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d8jn6" 
event={"ID":"55c909c3-a57a-4440-9052-48718b1d2dfd","Type":"ContainerDied","Data":"c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff"} Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.196809 4803 scope.go:117] "RemoveContainer" containerID="c72a012ebb31abdf41c8d2d0e2ede7af60d809809e6e97c099a0dc385d562eff" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.216167 4803 scope.go:117] "RemoveContainer" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.242301 4803 scope.go:117] "RemoveContainer" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.250617 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4v5dx"] Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.255324 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4v5dx"] Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.267548 4803 scope.go:117] "RemoveContainer" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.292995 4803 scope.go:117] "RemoveContainer" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293567 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-var-lib-openvswitch\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293626 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-ovn\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293661 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-var-lib-openvswitch\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293689 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovn-node-metrics-cert\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293721 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djszl\" (UniqueName: \"kubernetes.io/projected/c5ae3307-3764-4492-96ec-50d7d0944cfd-kube-api-access-djszl\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293753 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-slash\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293824 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-cni-netd\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293860 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-log-socket\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293896 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-openvswitch\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293931 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovnkube-config\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293975 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-ovn\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.293995 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-etc-openvswitch\") pod 
\"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294035 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-node-log\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294073 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovnkube-script-lib\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294113 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-systemd-units\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294143 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-kubelet\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294181 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mnhnz\" (UID: 
\"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294212 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-env-overrides\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294240 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294274 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-systemd\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294309 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-cni-bin\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294350 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-run-netns\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294450 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-run-netns\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294495 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-node-log\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.294552 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-slash\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.295115 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-cni-netd\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.295147 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-log-socket\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.295174 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-openvswitch\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.295620 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovnkube-script-lib\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.295701 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-systemd-units\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.295747 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-kubelet\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.295791 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.296351 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-env-overrides\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.296425 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-run-ovn-kubernetes\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.296472 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-run-systemd\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.296540 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-host-cni-bin\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.296883 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovnkube-config\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.296882 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5ae3307-3764-4492-96ec-50d7d0944cfd-etc-openvswitch\") pod 
\"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.299316 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5ae3307-3764-4492-96ec-50d7d0944cfd-ovn-node-metrics-cert\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.316805 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djszl\" (UniqueName: \"kubernetes.io/projected/c5ae3307-3764-4492-96ec-50d7d0944cfd-kube-api-access-djszl\") pod \"ovnkube-node-mnhnz\" (UID: \"c5ae3307-3764-4492-96ec-50d7d0944cfd\") " pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.326115 4803 scope.go:117] "RemoveContainer" containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.352174 4803 scope.go:117] "RemoveContainer" containerID="1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.367861 4803 scope.go:117] "RemoveContainer" containerID="255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.382272 4803 scope.go:117] "RemoveContainer" containerID="4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.397517 4803 scope.go:117] "RemoveContainer" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.397928 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": container with ID starting with 93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc not found: ID does not exist" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.398008 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} err="failed to get container status \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": rpc error: code = NotFound desc = could not find container \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": container with ID starting with 93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.398047 4803 scope.go:117] "RemoveContainer" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.398361 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": container with ID starting with ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00 not found: ID does not exist" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.398393 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} err="failed to get container status \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": rpc error: code = NotFound desc = could not find container \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": container with ID 
starting with ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.398416 4803 scope.go:117] "RemoveContainer" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.398711 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": container with ID starting with f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c not found: ID does not exist" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.398740 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} err="failed to get container status \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": rpc error: code = NotFound desc = could not find container \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": container with ID starting with f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.398757 4803 scope.go:117] "RemoveContainer" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.399021 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": container with ID starting with 0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869 not found: ID does not exist" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869" Mar 20 
17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.399062 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} err="failed to get container status \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": rpc error: code = NotFound desc = could not find container \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": container with ID starting with 0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.399094 4803 scope.go:117] "RemoveContainer" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.399386 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": container with ID starting with 8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a not found: ID does not exist" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.399435 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} err="failed to get container status \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": rpc error: code = NotFound desc = could not find container \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": container with ID starting with 8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.399468 4803 scope.go:117] "RemoveContainer" 
containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.400095 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": container with ID starting with 07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129 not found: ID does not exist" containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.400128 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} err="failed to get container status \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": rpc error: code = NotFound desc = could not find container \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": container with ID starting with 07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.400148 4803 scope.go:117] "RemoveContainer" containerID="1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.400395 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": container with ID starting with 1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e not found: ID does not exist" containerID="1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.400430 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} err="failed to get container status \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": rpc error: code = NotFound desc = could not find container \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": container with ID starting with 1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.400453 4803 scope.go:117] "RemoveContainer" containerID="255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.400692 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": container with ID starting with 255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc not found: ID does not exist" containerID="255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.400730 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} err="failed to get container status \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": rpc error: code = NotFound desc = could not find container \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": container with ID starting with 255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.400750 4803 scope.go:117] "RemoveContainer" containerID="4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe" Mar 20 17:27:49 crc kubenswrapper[4803]: E0320 17:27:49.400966 4803 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": container with ID starting with 4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe not found: ID does not exist" containerID="4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.400995 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} err="failed to get container status \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": rpc error: code = NotFound desc = could not find container \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": container with ID starting with 4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.401013 4803 scope.go:117] "RemoveContainer" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.401195 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} err="failed to get container status \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": rpc error: code = NotFound desc = could not find container \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": container with ID starting with 93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.401222 4803 scope.go:117] "RemoveContainer" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.401419 4803 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} err="failed to get container status \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": rpc error: code = NotFound desc = could not find container \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": container with ID starting with ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.401457 4803 scope.go:117] "RemoveContainer" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.401818 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} err="failed to get container status \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": rpc error: code = NotFound desc = could not find container \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": container with ID starting with f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.401843 4803 scope.go:117] "RemoveContainer" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402067 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} err="failed to get container status \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": rpc error: code = NotFound desc = could not find container \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": container with ID starting with 
0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402097 4803 scope.go:117] "RemoveContainer" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402302 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} err="failed to get container status \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": rpc error: code = NotFound desc = could not find container \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": container with ID starting with 8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402327 4803 scope.go:117] "RemoveContainer" containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402554 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} err="failed to get container status \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": rpc error: code = NotFound desc = could not find container \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": container with ID starting with 07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402584 4803 scope.go:117] "RemoveContainer" containerID="1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402792 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} err="failed to get container status \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": rpc error: code = NotFound desc = could not find container \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": container with ID starting with 1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.402822 4803 scope.go:117] "RemoveContainer" containerID="255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403033 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} err="failed to get container status \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": rpc error: code = NotFound desc = could not find container \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": container with ID starting with 255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403058 4803 scope.go:117] "RemoveContainer" containerID="4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403266 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} err="failed to get container status \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": rpc error: code = NotFound desc = could not find container \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": container with ID starting with 4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe not found: ID does not 
exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403289 4803 scope.go:117] "RemoveContainer" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403460 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} err="failed to get container status \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": rpc error: code = NotFound desc = could not find container \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": container with ID starting with 93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403486 4803 scope.go:117] "RemoveContainer" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403712 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} err="failed to get container status \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": rpc error: code = NotFound desc = could not find container \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": container with ID starting with ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.403740 4803 scope.go:117] "RemoveContainer" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.404260 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} err="failed to get container status 
\"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": rpc error: code = NotFound desc = could not find container \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": container with ID starting with f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.404288 4803 scope.go:117] "RemoveContainer" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.404557 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} err="failed to get container status \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": rpc error: code = NotFound desc = could not find container \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": container with ID starting with 0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.404594 4803 scope.go:117] "RemoveContainer" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.404901 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} err="failed to get container status \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": rpc error: code = NotFound desc = could not find container \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": container with ID starting with 8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.404932 4803 scope.go:117] "RemoveContainer" 
containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405175 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} err="failed to get container status \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": rpc error: code = NotFound desc = could not find container \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": container with ID starting with 07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129 not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405208 4803 scope.go:117] "RemoveContainer" containerID="1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405435 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} err="failed to get container status \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": rpc error: code = NotFound desc = could not find container \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": container with ID starting with 1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405468 4803 scope.go:117] "RemoveContainer" containerID="255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405718 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} err="failed to get container status \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": rpc error: code = NotFound desc = could 
not find container \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": container with ID starting with 255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405746 4803 scope.go:117] "RemoveContainer" containerID="4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405959 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} err="failed to get container status \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": rpc error: code = NotFound desc = could not find container \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": container with ID starting with 4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.405988 4803 scope.go:117] "RemoveContainer" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.406191 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} err="failed to get container status \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": rpc error: code = NotFound desc = could not find container \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": container with ID starting with 93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc not found: ID does not exist" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.406214 4803 scope.go:117] "RemoveContainer" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00" Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 
17:27:49.406418 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} err="failed to get container status \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": rpc error: code = NotFound desc = could not find container \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": container with ID starting with ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00 not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.406448 4803 scope.go:117] "RemoveContainer" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.406666 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} err="failed to get container status \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": rpc error: code = NotFound desc = could not find container \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": container with ID starting with f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.406691 4803 scope.go:117] "RemoveContainer" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.406893 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} err="failed to get container status \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": rpc error: code = NotFound desc = could not find container \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": container with ID starting with 0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869 not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.406919 4803 scope.go:117] "RemoveContainer" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407095 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} err="failed to get container status \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": rpc error: code = NotFound desc = could not find container \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": container with ID starting with 8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407123 4803 scope.go:117] "RemoveContainer" containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407321 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} err="failed to get container status \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": rpc error: code = NotFound desc = could not find container \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": container with ID starting with 07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129 not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407344 4803 scope.go:117] "RemoveContainer" containerID="1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407556 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e"} err="failed to get container status \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": rpc error: code = NotFound desc = could not find container \"1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e\": container with ID starting with 1da0fa5044fbb2dec17e7613ff19dc700379332a076fc5b57d59c252c2aa740e not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407586 4803 scope.go:117] "RemoveContainer" containerID="255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407810 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc"} err="failed to get container status \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": rpc error: code = NotFound desc = could not find container \"255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc\": container with ID starting with 255b582d61f9a88153bf11327958627c6d7db80909bc5afb6bfe4ebccb4576cc not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.407835 4803 scope.go:117] "RemoveContainer" containerID="4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408053 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe"} err="failed to get container status \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": rpc error: code = NotFound desc = could not find container \"4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe\": container with ID starting with 4f42312e5037f8996c18f0a4ffac1dc88cdd9ea309b590bf09d100bc921c03fe not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408077 4803 scope.go:117] "RemoveContainer" containerID="93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408265 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc"} err="failed to get container status \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": rpc error: code = NotFound desc = could not find container \"93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc\": container with ID starting with 93651f51cb6d014e14de7eddeba68658dc87c82ebf983b8b8953643b6ac731bc not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408287 4803 scope.go:117] "RemoveContainer" containerID="ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408508 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00"} err="failed to get container status \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": rpc error: code = NotFound desc = could not find container \"ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00\": container with ID starting with ffa889533aadd2adba168331b1673df856749c5e0552c72d073e9b626fa2ca00 not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408556 4803 scope.go:117] "RemoveContainer" containerID="f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408758 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c"} err="failed to get container status \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": rpc error: code = NotFound desc = could not find container \"f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c\": container with ID starting with f5f2a69aa0df7a02c3e8f9739d679f22789175477901d8b314e6ebe67a88812c not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408780 4803 scope.go:117] "RemoveContainer" containerID="0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.408996 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869"} err="failed to get container status \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": rpc error: code = NotFound desc = could not find container \"0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869\": container with ID starting with 0b0940ab9d4e77ae2f2968db80a963ae9360cc99649ea509e6cdc1410eb23869 not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.409021 4803 scope.go:117] "RemoveContainer" containerID="8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.409200 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a"} err="failed to get container status \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": rpc error: code = NotFound desc = could not find container \"8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a\": container with ID starting with 8e929ff91b86029311a66dbd8a0d5c4b9a0f9fea2a1a6d41a9355487eb3b946a not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.409220 4803 scope.go:117] "RemoveContainer" containerID="07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.409411 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129"} err="failed to get container status \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": rpc error: code = NotFound desc = could not find container \"07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129\": container with ID starting with 07f0bf1039dd71baf8c6bd67c4b0b8325e27e7922bdb495766e247e04c47b129 not found: ID does not exist"
Mar 20 17:27:49 crc kubenswrapper[4803]: I0320 17:27:49.425922 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz"
Mar 20 17:27:49 crc kubenswrapper[4803]: W0320 17:27:49.439772 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5ae3307_3764_4492_96ec_50d7d0944cfd.slice/crio-aaa7e77de81939d6907de680467e91639e6a567cfecd7dd4c996dc53620fa421 WatchSource:0}: Error finding container aaa7e77de81939d6907de680467e91639e6a567cfecd7dd4c996dc53620fa421: Status 404 returned error can't find the container with id aaa7e77de81939d6907de680467e91639e6a567cfecd7dd4c996dc53620fa421
Mar 20 17:27:50 crc kubenswrapper[4803]: I0320 17:27:50.210004 4803 generic.go:334] "Generic (PLEG): container finished" podID="c5ae3307-3764-4492-96ec-50d7d0944cfd" containerID="5fc1823a7edfd864a7e1647176286005428a7aeab4f9cea3f00a38111f2ec8ce" exitCode=0
Mar 20 17:27:50 crc kubenswrapper[4803]: I0320 17:27:50.210091 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerDied","Data":"5fc1823a7edfd864a7e1647176286005428a7aeab4f9cea3f00a38111f2ec8ce"}
Mar 20 17:27:50 crc kubenswrapper[4803]: I0320 17:27:50.210508 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"aaa7e77de81939d6907de680467e91639e6a567cfecd7dd4c996dc53620fa421"}
Mar 20 17:27:50 crc kubenswrapper[4803]: I0320 17:27:50.216304 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-d8jn6_55c909c3-a57a-4440-9052-48718b1d2dfd/kube-multus/0.log"
Mar 20 17:27:50 crc kubenswrapper[4803]: I0320 17:27:50.216428 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-d8jn6" event={"ID":"55c909c3-a57a-4440-9052-48718b1d2dfd","Type":"ContainerStarted","Data":"e6374e92e0cf17d41a3bb0599213eb6a4c05ae67bfb81a5fdb67672efde5b75f"}
Mar 20 17:27:50 crc kubenswrapper[4803]: I0320 17:27:50.857937 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4326b171-36ab-465f-ba67-a636b36f1f89" path="/var/lib/kubelet/pods/4326b171-36ab-465f-ba67-a636b36f1f89/volumes"
Mar 20 17:27:51 crc kubenswrapper[4803]: I0320 17:27:51.239512 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"fef5594aef8e4e5550377ad52821bc16eb387506b0894bf9f8c09831f257f503"}
Mar 20 17:27:51 crc kubenswrapper[4803]: I0320 17:27:51.239624 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"14373be10de7ac541f6418bc31c8bf3695e2d8336ae02d41673e0815775766d6"}
Mar 20 17:27:51 crc kubenswrapper[4803]: I0320 17:27:51.239652 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"c56b6e64a2c372b655a967e2ff0f3267bc5e743c56a3f09d19ea0ef36978f668"}
Mar 20 17:27:51 crc kubenswrapper[4803]: I0320 17:27:51.239676 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"bb86d57ddc698395745edef55102b928920869c50df8d7896d354a7af771f083"}
Mar 20 17:27:51 crc kubenswrapper[4803]: I0320 17:27:51.239698 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"ffe46b1e99237af00ec51cf6622186f74dfa1560b3fac636a5a7c20bd614e753"}
Mar 20 17:27:51 crc kubenswrapper[4803]: I0320 17:27:51.239723 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"29e0d9693d6d62dd61d88727cbd33606b8d2341ed35c46e489e8803ff69a0ebc"}
Mar 20 17:27:54 crc kubenswrapper[4803]: I0320 17:27:54.265178 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"3ff45e6976fafc200991132c395f7bd24d4537fc68ea94d2af3e6f97f6b2f3cd"}
Mar 20 17:27:56 crc kubenswrapper[4803]: I0320 17:27:56.302637 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" event={"ID":"c5ae3307-3764-4492-96ec-50d7d0944cfd","Type":"ContainerStarted","Data":"4759f588405715d369a1747fa0d32a00b36a6d0405e2ab9a0f26bffaf347e114"}
Mar 20 17:27:56 crc kubenswrapper[4803]: I0320 17:27:56.303448 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz"
Mar 20 17:27:56 crc kubenswrapper[4803]: I0320 17:27:56.303666 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz"
Mar 20 17:27:56 crc kubenswrapper[4803]: I0320 17:27:56.303849 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz"
Mar 20 17:27:56 crc kubenswrapper[4803]: I0320 17:27:56.351837 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz"
Mar 20 17:27:56 crc kubenswrapper[4803]: I0320 17:27:56.357654 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz"
Mar 20 17:27:56 crc kubenswrapper[4803]: I0320 17:27:56.370478 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz" podStartSLOduration=7.370420927 podStartE2EDuration="7.370420927s" podCreationTimestamp="2026-03-20 17:27:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:27:56.358332397 +0000 UTC m=+686.269924567" watchObservedRunningTime="2026-03-20 17:27:56.370420927 +0000 UTC m=+686.282038348"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.130501 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567128-mstjx"]
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.131747 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-mstjx"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.137002 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.137193 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.137012 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.140620 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-mstjx"]
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.257893 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4k9l\" (UniqueName: \"kubernetes.io/projected/975f31fa-3161-4c3e-aa64-e1150a9ca108-kube-api-access-m4k9l\") pod \"auto-csr-approver-29567128-mstjx\" (UID: \"975f31fa-3161-4c3e-aa64-e1150a9ca108\") " pod="openshift-infra/auto-csr-approver-29567128-mstjx"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.359709 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4k9l\" (UniqueName: \"kubernetes.io/projected/975f31fa-3161-4c3e-aa64-e1150a9ca108-kube-api-access-m4k9l\") pod \"auto-csr-approver-29567128-mstjx\" (UID: \"975f31fa-3161-4c3e-aa64-e1150a9ca108\") " pod="openshift-infra/auto-csr-approver-29567128-mstjx"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.389369 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4k9l\" (UniqueName: \"kubernetes.io/projected/975f31fa-3161-4c3e-aa64-e1150a9ca108-kube-api-access-m4k9l\") pod \"auto-csr-approver-29567128-mstjx\" (UID: \"975f31fa-3161-4c3e-aa64-e1150a9ca108\") " pod="openshift-infra/auto-csr-approver-29567128-mstjx"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.461133 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-mstjx"
Mar 20 17:28:00 crc kubenswrapper[4803]: I0320 17:28:00.943412 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-mstjx"]
Mar 20 17:28:00 crc kubenswrapper[4803]: W0320 17:28:00.962747 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod975f31fa_3161_4c3e_aa64_e1150a9ca108.slice/crio-472a369e21b21ede8bff7a70af6fed6a4b860458ad15abe79030ff46c876928d WatchSource:0}: Error finding container 472a369e21b21ede8bff7a70af6fed6a4b860458ad15abe79030ff46c876928d: Status 404 returned error can't find the container with id 472a369e21b21ede8bff7a70af6fed6a4b860458ad15abe79030ff46c876928d
Mar 20 17:28:01 crc kubenswrapper[4803]: I0320 17:28:01.333804 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-mstjx" event={"ID":"975f31fa-3161-4c3e-aa64-e1150a9ca108","Type":"ContainerStarted","Data":"472a369e21b21ede8bff7a70af6fed6a4b860458ad15abe79030ff46c876928d"}
Mar 20 17:28:03 crc kubenswrapper[4803]: I0320 17:28:03.354227 4803 generic.go:334] "Generic (PLEG): container finished" podID="975f31fa-3161-4c3e-aa64-e1150a9ca108" containerID="755c8f2ba2febb53f0d50b53f01b630e5059c43f04aea417d4b341c0e21ff85f" exitCode=0
Mar 20 17:28:03 crc kubenswrapper[4803]: I0320 17:28:03.354296 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-mstjx" event={"ID":"975f31fa-3161-4c3e-aa64-e1150a9ca108","Type":"ContainerDied","Data":"755c8f2ba2febb53f0d50b53f01b630e5059c43f04aea417d4b341c0e21ff85f"}
Mar 20 17:28:04 crc kubenswrapper[4803]: I0320 17:28:04.708328 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-mstjx"
Mar 20 17:28:04 crc kubenswrapper[4803]: I0320 17:28:04.822705 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4k9l\" (UniqueName: \"kubernetes.io/projected/975f31fa-3161-4c3e-aa64-e1150a9ca108-kube-api-access-m4k9l\") pod \"975f31fa-3161-4c3e-aa64-e1150a9ca108\" (UID: \"975f31fa-3161-4c3e-aa64-e1150a9ca108\") "
Mar 20 17:28:04 crc kubenswrapper[4803]: I0320 17:28:04.830625 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975f31fa-3161-4c3e-aa64-e1150a9ca108-kube-api-access-m4k9l" (OuterVolumeSpecName: "kube-api-access-m4k9l") pod "975f31fa-3161-4c3e-aa64-e1150a9ca108" (UID: "975f31fa-3161-4c3e-aa64-e1150a9ca108"). InnerVolumeSpecName "kube-api-access-m4k9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:28:04 crc kubenswrapper[4803]: I0320 17:28:04.924851 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4k9l\" (UniqueName: \"kubernetes.io/projected/975f31fa-3161-4c3e-aa64-e1150a9ca108-kube-api-access-m4k9l\") on node \"crc\" DevicePath \"\""
Mar 20 17:28:05 crc kubenswrapper[4803]: I0320 17:28:05.369693 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567128-mstjx" event={"ID":"975f31fa-3161-4c3e-aa64-e1150a9ca108","Type":"ContainerDied","Data":"472a369e21b21ede8bff7a70af6fed6a4b860458ad15abe79030ff46c876928d"}
Mar 20 17:28:05 crc kubenswrapper[4803]: I0320 17:28:05.369752 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472a369e21b21ede8bff7a70af6fed6a4b860458ad15abe79030ff46c876928d"
Mar 20 17:28:05 crc kubenswrapper[4803]: I0320 17:28:05.369822 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567128-mstjx"
Mar 20 17:28:05 crc kubenswrapper[4803]: I0320 17:28:05.770561 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-nz79x"]
Mar 20 17:28:05 crc kubenswrapper[4803]: I0320 17:28:05.774570 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567122-nz79x"]
Mar 20 17:28:06 crc kubenswrapper[4803]: I0320 17:28:06.860478 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c857ab36-26c2-4a5a-8ab3-c92ef681c2ac" path="/var/lib/kubelet/pods/c857ab36-26c2-4a5a-8ab3-c92ef681c2ac/volumes"
Mar 20 17:28:10 crc kubenswrapper[4803]: I0320 17:28:10.906125 4803 scope.go:117] "RemoveContainer" containerID="c954e60afe47b8d8245fb3678f7218c2125d7f2a12a3ddf401469e2b3ba49531"
Mar 20 17:28:10 crc kubenswrapper[4803]: I0320 17:28:10.949950 4803 scope.go:117] "RemoveContainer" containerID="5e587bbaff39129ebfc02d71b8e2b857d989fd5c0ce4afc73eaf69943e0b3a50"
Mar 20 17:28:19 crc kubenswrapper[4803]: I0320 17:28:19.467817 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mnhnz"
Mar 20 17:28:27 crc kubenswrapper[4803]: I0320 17:28:27.273177 4803 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.586383 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"]
Mar 20 17:28:29 crc kubenswrapper[4803]: E0320 17:28:29.586709 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="975f31fa-3161-4c3e-aa64-e1150a9ca108" containerName="oc"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.586731 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="975f31fa-3161-4c3e-aa64-e1150a9ca108" containerName="oc"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.586930 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="975f31fa-3161-4c3e-aa64-e1150a9ca108" containerName="oc"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.588184 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.592184 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.602186 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"]
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.681689 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvqq7\" (UniqueName: \"kubernetes.io/projected/717f35f6-140a-4071-8f5a-ce0e166c79bb-kube-api-access-qvqq7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.682027 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.682097 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.783904 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.784035 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.784714 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.784813 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.784983 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvqq7\" (UniqueName: \"kubernetes.io/projected/717f35f6-140a-4071-8f5a-ce0e166c79bb-kube-api-access-qvqq7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.823152 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvqq7\" (UniqueName: \"kubernetes.io/projected/717f35f6-140a-4071-8f5a-ce0e166c79bb-kube-api-access-qvqq7\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:29 crc kubenswrapper[4803]: I0320 17:28:29.918116 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:30 crc kubenswrapper[4803]: I0320 17:28:30.156147 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"]
Mar 20 17:28:30 crc kubenswrapper[4803]: I0320 17:28:30.557284 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x" event={"ID":"717f35f6-140a-4071-8f5a-ce0e166c79bb","Type":"ContainerStarted","Data":"76fb4a80110e393c9b5be875eb64f8cf2e33de225c765fcf84377d78acf602c8"}
Mar 20 17:28:30 crc kubenswrapper[4803]: I0320 17:28:30.557375 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x" event={"ID":"717f35f6-140a-4071-8f5a-ce0e166c79bb","Type":"ContainerStarted","Data":"a774a54855a0e2545122ffdd074e53bd05bdf00d0462ed75cd4cf22fbd82c973"}
Mar 20 17:28:31 crc kubenswrapper[4803]: I0320 17:28:31.566627 4803 generic.go:334] "Generic (PLEG): container finished" podID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerID="76fb4a80110e393c9b5be875eb64f8cf2e33de225c765fcf84377d78acf602c8" exitCode=0
Mar 20 17:28:31 crc kubenswrapper[4803]: I0320 17:28:31.566720 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x" event={"ID":"717f35f6-140a-4071-8f5a-ce0e166c79bb","Type":"ContainerDied","Data":"76fb4a80110e393c9b5be875eb64f8cf2e33de225c765fcf84377d78acf602c8"}
Mar 20 17:28:31 crc kubenswrapper[4803]: I0320 17:28:31.948851 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v5qk5"]
Mar 20 17:28:31 crc kubenswrapper[4803]: I0320 17:28:31.950673 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:31 crc kubenswrapper[4803]: I0320 17:28:31.970273 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5qk5"]
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.015036 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-catalog-content\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.015098 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-strnl\" (UniqueName: \"kubernetes.io/projected/fa2b2d8a-15ef-454c-a743-d3d421fb4262-kube-api-access-strnl\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.015203 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-utilities\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.116242 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-utilities\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.116505 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-strnl\" (UniqueName: \"kubernetes.io/projected/fa2b2d8a-15ef-454c-a743-d3d421fb4262-kube-api-access-strnl\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.116591 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-catalog-content\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.117148 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-utilities\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.117256 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-catalog-content\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.154706 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-strnl\" (UniqueName: \"kubernetes.io/projected/fa2b2d8a-15ef-454c-a743-d3d421fb4262-kube-api-access-strnl\") pod \"redhat-operators-v5qk5\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.278328 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5qk5"
Mar 20 17:28:32 crc kubenswrapper[4803]: I0320 17:28:32.699347 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5qk5"]
Mar 20 17:28:33 crc kubenswrapper[4803]: I0320 17:28:33.580870 4803 generic.go:334] "Generic (PLEG): container finished" podID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerID="802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e" exitCode=0
Mar 20 17:28:33 crc kubenswrapper[4803]: I0320 17:28:33.580913 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5qk5" event={"ID":"fa2b2d8a-15ef-454c-a743-d3d421fb4262","Type":"ContainerDied","Data":"802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e"}
Mar 20 17:28:33 crc kubenswrapper[4803]: I0320 17:28:33.581268 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5qk5" event={"ID":"fa2b2d8a-15ef-454c-a743-d3d421fb4262","Type":"ContainerStarted","Data":"a3dffe85b9266eb5f5b7042dee395300f9f6692c5607b30f38bf8ae7045ff2d3"}
Mar 20 17:28:33 crc kubenswrapper[4803]: I0320 17:28:33.584166 4803 generic.go:334] "Generic (PLEG): container finished" podID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerID="6e885135d24aec486ab1dbbd5738f8cc0678a965d6cfbdbab47530cf0f3e37c7" exitCode=0
Mar 20 17:28:33 crc kubenswrapper[4803]: I0320 17:28:33.584217 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x" event={"ID":"717f35f6-140a-4071-8f5a-ce0e166c79bb","Type":"ContainerDied","Data":"6e885135d24aec486ab1dbbd5738f8cc0678a965d6cfbdbab47530cf0f3e37c7"}
Mar 20 17:28:34 crc kubenswrapper[4803]: I0320 17:28:34.598514 4803 generic.go:334] "Generic (PLEG): container finished" podID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerID="eedc683e83c6c004b6188b2c6a7f1719f216aca06a337a3d1566505d2d0ec9e1" exitCode=0
Mar 20 17:28:34 crc kubenswrapper[4803]: I0320 17:28:34.598586 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x" event={"ID":"717f35f6-140a-4071-8f5a-ce0e166c79bb","Type":"ContainerDied","Data":"eedc683e83c6c004b6188b2c6a7f1719f216aca06a337a3d1566505d2d0ec9e1"}
Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.606978 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5qk5" event={"ID":"fa2b2d8a-15ef-454c-a743-d3d421fb4262","Type":"ContainerStarted","Data":"b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110"}
Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.868266 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x"
Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.972491 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvqq7\" (UniqueName: \"kubernetes.io/projected/717f35f6-140a-4071-8f5a-ce0e166c79bb-kube-api-access-qvqq7\") pod \"717f35f6-140a-4071-8f5a-ce0e166c79bb\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") "
Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.972616 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-bundle\") pod \"717f35f6-140a-4071-8f5a-ce0e166c79bb\" (UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") "
Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.972682 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-util\") pod \"717f35f6-140a-4071-8f5a-ce0e166c79bb\"
(UID: \"717f35f6-140a-4071-8f5a-ce0e166c79bb\") " Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.973769 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-bundle" (OuterVolumeSpecName: "bundle") pod "717f35f6-140a-4071-8f5a-ce0e166c79bb" (UID: "717f35f6-140a-4071-8f5a-ce0e166c79bb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.980205 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717f35f6-140a-4071-8f5a-ce0e166c79bb-kube-api-access-qvqq7" (OuterVolumeSpecName: "kube-api-access-qvqq7") pod "717f35f6-140a-4071-8f5a-ce0e166c79bb" (UID: "717f35f6-140a-4071-8f5a-ce0e166c79bb"). InnerVolumeSpecName "kube-api-access-qvqq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:28:35 crc kubenswrapper[4803]: I0320 17:28:35.985718 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-util" (OuterVolumeSpecName: "util") pod "717f35f6-140a-4071-8f5a-ce0e166c79bb" (UID: "717f35f6-140a-4071-8f5a-ce0e166c79bb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.074662 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvqq7\" (UniqueName: \"kubernetes.io/projected/717f35f6-140a-4071-8f5a-ce0e166c79bb-kube-api-access-qvqq7\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.074702 4803 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.074715 4803 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/717f35f6-140a-4071-8f5a-ce0e166c79bb-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.618583 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x" event={"ID":"717f35f6-140a-4071-8f5a-ce0e166c79bb","Type":"ContainerDied","Data":"a774a54855a0e2545122ffdd074e53bd05bdf00d0462ed75cd4cf22fbd82c973"} Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.618621 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x" Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.618639 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a774a54855a0e2545122ffdd074e53bd05bdf00d0462ed75cd4cf22fbd82c973" Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.623865 4803 generic.go:334] "Generic (PLEG): container finished" podID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerID="b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110" exitCode=0 Mar 20 17:28:36 crc kubenswrapper[4803]: I0320 17:28:36.623952 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5qk5" event={"ID":"fa2b2d8a-15ef-454c-a743-d3d421fb4262","Type":"ContainerDied","Data":"b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110"} Mar 20 17:28:37 crc kubenswrapper[4803]: I0320 17:28:37.635115 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5qk5" event={"ID":"fa2b2d8a-15ef-454c-a743-d3d421fb4262","Type":"ContainerStarted","Data":"19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a"} Mar 20 17:28:37 crc kubenswrapper[4803]: I0320 17:28:37.672653 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v5qk5" podStartSLOduration=2.986519666 podStartE2EDuration="6.672625614s" podCreationTimestamp="2026-03-20 17:28:31 +0000 UTC" firstStartedPulling="2026-03-20 17:28:33.582478667 +0000 UTC m=+723.494070737" lastFinishedPulling="2026-03-20 17:28:37.268584575 +0000 UTC m=+727.180176685" observedRunningTime="2026-03-20 17:28:37.666015579 +0000 UTC m=+727.577607729" watchObservedRunningTime="2026-03-20 17:28:37.672625614 +0000 UTC m=+727.584217724" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.853320 4803 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl"] Mar 20 17:28:39 crc kubenswrapper[4803]: E0320 17:28:39.853909 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerName="util" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.853930 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerName="util" Mar 20 17:28:39 crc kubenswrapper[4803]: E0320 17:28:39.853951 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerName="pull" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.853961 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerName="pull" Mar 20 17:28:39 crc kubenswrapper[4803]: E0320 17:28:39.853975 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerName="extract" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.853988 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerName="extract" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.854168 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="717f35f6-140a-4071-8f5a-ce0e166c79bb" containerName="extract" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.854815 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.857221 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-tnms5" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.858475 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.859297 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.863418 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl"] Mar 20 17:28:39 crc kubenswrapper[4803]: I0320 17:28:39.943616 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvfs\" (UniqueName: \"kubernetes.io/projected/f2fb97b5-40f7-443d-8680-95e112804031-kube-api-access-6tvfs\") pod \"nmstate-operator-796d4cfff4-xgfwl\" (UID: \"f2fb97b5-40f7-443d-8680-95e112804031\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" Mar 20 17:28:40 crc kubenswrapper[4803]: I0320 17:28:40.044472 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvfs\" (UniqueName: \"kubernetes.io/projected/f2fb97b5-40f7-443d-8680-95e112804031-kube-api-access-6tvfs\") pod \"nmstate-operator-796d4cfff4-xgfwl\" (UID: \"f2fb97b5-40f7-443d-8680-95e112804031\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" Mar 20 17:28:40 crc kubenswrapper[4803]: I0320 17:28:40.067323 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tvfs\" (UniqueName: \"kubernetes.io/projected/f2fb97b5-40f7-443d-8680-95e112804031-kube-api-access-6tvfs\") pod \"nmstate-operator-796d4cfff4-xgfwl\" (UID: 
\"f2fb97b5-40f7-443d-8680-95e112804031\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" Mar 20 17:28:40 crc kubenswrapper[4803]: I0320 17:28:40.173497 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" Mar 20 17:28:40 crc kubenswrapper[4803]: I0320 17:28:40.489444 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl"] Mar 20 17:28:40 crc kubenswrapper[4803]: I0320 17:28:40.653383 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" event={"ID":"f2fb97b5-40f7-443d-8680-95e112804031","Type":"ContainerStarted","Data":"48523ff8ea83c794ee3d8bb06515728727bae92ff95af43af0860b13a2416257"} Mar 20 17:28:42 crc kubenswrapper[4803]: I0320 17:28:42.279403 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v5qk5" Mar 20 17:28:42 crc kubenswrapper[4803]: I0320 17:28:42.280747 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v5qk5" Mar 20 17:28:43 crc kubenswrapper[4803]: I0320 17:28:43.356201 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v5qk5" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="registry-server" probeResult="failure" output=< Mar 20 17:28:43 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 17:28:43 crc kubenswrapper[4803]: > Mar 20 17:28:44 crc kubenswrapper[4803]: I0320 17:28:44.691440 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" event={"ID":"f2fb97b5-40f7-443d-8680-95e112804031","Type":"ContainerStarted","Data":"219c7b34709f3a2213f8897be34387934d25831489dbec366a5be4be2578a412"} Mar 20 17:28:44 crc kubenswrapper[4803]: I0320 17:28:44.721279 4803 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-xgfwl" podStartSLOduration=2.321819016 podStartE2EDuration="5.721254181s" podCreationTimestamp="2026-03-20 17:28:39 +0000 UTC" firstStartedPulling="2026-03-20 17:28:40.496132312 +0000 UTC m=+730.407724382" lastFinishedPulling="2026-03-20 17:28:43.895567477 +0000 UTC m=+733.807159547" observedRunningTime="2026-03-20 17:28:44.717568184 +0000 UTC m=+734.629160254" watchObservedRunningTime="2026-03-20 17:28:44.721254181 +0000 UTC m=+734.632846291" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.721688 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k"] Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.723210 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.725620 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-zbwqh" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.743295 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k"] Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.748659 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-krj4f"] Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.749440 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.751645 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.776991 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-krj4f"] Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.785643 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-stkpl"] Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.786500 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.871341 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0bf484cc-0fe2-4cb0-99c2-0714910012ca-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-krj4f\" (UID: \"0bf484cc-0fe2-4cb0-99c2-0714910012ca\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.871425 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-ovs-socket\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.871458 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-nmstate-lock\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 
17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.871494 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd7qg\" (UniqueName: \"kubernetes.io/projected/0bf484cc-0fe2-4cb0-99c2-0714910012ca-kube-api-access-nd7qg\") pod \"nmstate-webhook-5f558f5558-krj4f\" (UID: \"0bf484cc-0fe2-4cb0-99c2-0714910012ca\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.871517 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-dbus-socket\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.871565 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngjr2\" (UniqueName: \"kubernetes.io/projected/8b0bc609-411b-43cc-b7cf-a88f669b2d44-kube-api-access-ngjr2\") pod \"nmstate-metrics-9b8c8685d-fjk5k\" (UID: \"8b0bc609-411b-43cc-b7cf-a88f669b2d44\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.871588 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftvv\" (UniqueName: \"kubernetes.io/projected/7130990e-c3d3-48fc-99a3-31f225ec19ee-kube-api-access-xftvv\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.888204 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb"] Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.888997 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.890406 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-2tp9l" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.891193 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.891488 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.903112 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb"] Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983200 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjr2\" (UniqueName: \"kubernetes.io/projected/8b0bc609-411b-43cc-b7cf-a88f669b2d44-kube-api-access-ngjr2\") pod \"nmstate-metrics-9b8c8685d-fjk5k\" (UID: \"8b0bc609-411b-43cc-b7cf-a88f669b2d44\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983239 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftvv\" (UniqueName: \"kubernetes.io/projected/7130990e-c3d3-48fc-99a3-31f225ec19ee-kube-api-access-xftvv\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983293 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0bf484cc-0fe2-4cb0-99c2-0714910012ca-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-krj4f\" (UID: \"0bf484cc-0fe2-4cb0-99c2-0714910012ca\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 
20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983312 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67wms\" (UniqueName: \"kubernetes.io/projected/38d598c3-b9e5-4404-abb4-da1e9354e157-kube-api-access-67wms\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983336 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/38d598c3-b9e5-4404-abb4-da1e9354e157-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983365 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/38d598c3-b9e5-4404-abb4-da1e9354e157-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983385 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-ovs-socket\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983404 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-nmstate-lock\") pod \"nmstate-handler-stkpl\" (UID: 
\"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983428 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd7qg\" (UniqueName: \"kubernetes.io/projected/0bf484cc-0fe2-4cb0-99c2-0714910012ca-kube-api-access-nd7qg\") pod \"nmstate-webhook-5f558f5558-krj4f\" (UID: \"0bf484cc-0fe2-4cb0-99c2-0714910012ca\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.983446 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-dbus-socket\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: E0320 17:28:49.983747 4803 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 17:28:49 crc kubenswrapper[4803]: E0320 17:28:49.983822 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0bf484cc-0fe2-4cb0-99c2-0714910012ca-tls-key-pair podName:0bf484cc-0fe2-4cb0-99c2-0714910012ca nodeName:}" failed. No retries permitted until 2026-03-20 17:28:50.483797936 +0000 UTC m=+740.395390126 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/0bf484cc-0fe2-4cb0-99c2-0714910012ca-tls-key-pair") pod "nmstate-webhook-5f558f5558-krj4f" (UID: "0bf484cc-0fe2-4cb0-99c2-0714910012ca") : secret "openshift-nmstate-webhook" not found Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.984059 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-dbus-socket\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.984118 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-ovs-socket\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:49 crc kubenswrapper[4803]: I0320 17:28:49.990886 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7130990e-c3d3-48fc-99a3-31f225ec19ee-nmstate-lock\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.010584 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngjr2\" (UniqueName: \"kubernetes.io/projected/8b0bc609-411b-43cc-b7cf-a88f669b2d44-kube-api-access-ngjr2\") pod \"nmstate-metrics-9b8c8685d-fjk5k\" (UID: \"8b0bc609-411b-43cc-b7cf-a88f669b2d44\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.012271 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd7qg\" (UniqueName: 
\"kubernetes.io/projected/0bf484cc-0fe2-4cb0-99c2-0714910012ca-kube-api-access-nd7qg\") pod \"nmstate-webhook-5f558f5558-krj4f\" (UID: \"0bf484cc-0fe2-4cb0-99c2-0714910012ca\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.016235 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftvv\" (UniqueName: \"kubernetes.io/projected/7130990e-c3d3-48fc-99a3-31f225ec19ee-kube-api-access-xftvv\") pod \"nmstate-handler-stkpl\" (UID: \"7130990e-c3d3-48fc-99a3-31f225ec19ee\") " pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.044779 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.087084 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67wms\" (UniqueName: \"kubernetes.io/projected/38d598c3-b9e5-4404-abb4-da1e9354e157-kube-api-access-67wms\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.087139 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/38d598c3-b9e5-4404-abb4-da1e9354e157-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.087183 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/38d598c3-b9e5-4404-abb4-da1e9354e157-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: 
\"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:50 crc kubenswrapper[4803]: E0320 17:28:50.087339 4803 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 20 17:28:50 crc kubenswrapper[4803]: E0320 17:28:50.087392 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38d598c3-b9e5-4404-abb4-da1e9354e157-plugin-serving-cert podName:38d598c3-b9e5-4404-abb4-da1e9354e157 nodeName:}" failed. No retries permitted until 2026-03-20 17:28:50.587375527 +0000 UTC m=+740.498967607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/38d598c3-b9e5-4404-abb4-da1e9354e157-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-2nnmb" (UID: "38d598c3-b9e5-4404-abb4-da1e9354e157") : secret "plugin-serving-cert" not found Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.088719 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/38d598c3-b9e5-4404-abb4-da1e9354e157-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.090844 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8b7dccd6-8txp6"] Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.091468 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.105884 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67wms\" (UniqueName: \"kubernetes.io/projected/38d598c3-b9e5-4404-abb4-da1e9354e157-kube-api-access-67wms\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.108285 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.109422 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8b7dccd6-8txp6"] Mar 20 17:28:50 crc kubenswrapper[4803]: W0320 17:28:50.134828 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7130990e_c3d3_48fc_99a3_31f225ec19ee.slice/crio-1f010b27c7e87eed0e917f426722956844a0386546e1ec9f2ed10d1b5d80ddd9 WatchSource:0}: Error finding container 1f010b27c7e87eed0e917f426722956844a0386546e1ec9f2ed10d1b5d80ddd9: Status 404 returned error can't find the container with id 1f010b27c7e87eed0e917f426722956844a0386546e1ec9f2ed10d1b5d80ddd9 Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.188592 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-oauth-config\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.188648 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-service-ca\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.188709 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-trusted-ca-bundle\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.188730 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-serving-cert\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.188746 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rh9q\" (UniqueName: \"kubernetes.io/projected/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-kube-api-access-5rh9q\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.188771 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-oauth-serving-cert\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.188796 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-config\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.290411 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-oauth-config\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.290481 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-service-ca\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.290586 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-trusted-ca-bundle\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.290618 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-serving-cert\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.290643 4803 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-5rh9q\" (UniqueName: \"kubernetes.io/projected/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-kube-api-access-5rh9q\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.290675 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-oauth-serving-cert\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.290710 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-config\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.291880 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-config\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.292028 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-oauth-serving-cert\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.292599 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-trusted-ca-bundle\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.294676 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-service-ca\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.294870 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-oauth-config\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.295478 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-console-serving-cert\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.305300 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rh9q\" (UniqueName: \"kubernetes.io/projected/a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1-kube-api-access-5rh9q\") pod \"console-8b7dccd6-8txp6\" (UID: \"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1\") " pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.430088 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-8b7dccd6-8txp6" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.495319 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0bf484cc-0fe2-4cb0-99c2-0714910012ca-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-krj4f\" (UID: \"0bf484cc-0fe2-4cb0-99c2-0714910012ca\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.501654 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0bf484cc-0fe2-4cb0-99c2-0714910012ca-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-krj4f\" (UID: \"0bf484cc-0fe2-4cb0-99c2-0714910012ca\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.504766 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k"] Mar 20 17:28:50 crc kubenswrapper[4803]: W0320 17:28:50.510414 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0bc609_411b_43cc_b7cf_a88f669b2d44.slice/crio-3d5c4ca87396111c804cc3f097bb43d385fdf0f0913dd6fa2a119fea306cbd9a WatchSource:0}: Error finding container 3d5c4ca87396111c804cc3f097bb43d385fdf0f0913dd6fa2a119fea306cbd9a: Status 404 returned error can't find the container with id 3d5c4ca87396111c804cc3f097bb43d385fdf0f0913dd6fa2a119fea306cbd9a Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.597231 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/38d598c3-b9e5-4404-abb4-da1e9354e157-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" 
Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.604063 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/38d598c3-b9e5-4404-abb4-da1e9354e157-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2nnmb\" (UID: \"38d598c3-b9e5-4404-abb4-da1e9354e157\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.652021 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8b7dccd6-8txp6"] Mar 20 17:28:50 crc kubenswrapper[4803]: W0320 17:28:50.658012 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1fc6aa2_5ec5_4e13_9bea_90b9cc1468f1.slice/crio-ee65a89d9fe4ddab296623a8e058d0b352ca2e3a67b9b6c46684629d9c4db6d7 WatchSource:0}: Error finding container ee65a89d9fe4ddab296623a8e058d0b352ca2e3a67b9b6c46684629d9c4db6d7: Status 404 returned error can't find the container with id ee65a89d9fe4ddab296623a8e058d0b352ca2e3a67b9b6c46684629d9c4db6d7 Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.667035 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.747507 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b7dccd6-8txp6" event={"ID":"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1","Type":"ContainerStarted","Data":"ee65a89d9fe4ddab296623a8e058d0b352ca2e3a67b9b6c46684629d9c4db6d7"} Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.749759 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-stkpl" event={"ID":"7130990e-c3d3-48fc-99a3-31f225ec19ee","Type":"ContainerStarted","Data":"1f010b27c7e87eed0e917f426722956844a0386546e1ec9f2ed10d1b5d80ddd9"} Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.754396 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" event={"ID":"8b0bc609-411b-43cc-b7cf-a88f669b2d44","Type":"ContainerStarted","Data":"3d5c4ca87396111c804cc3f097bb43d385fdf0f0913dd6fa2a119fea306cbd9a"} Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.802466 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" Mar 20 17:28:50 crc kubenswrapper[4803]: I0320 17:28:50.959043 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-krj4f"] Mar 20 17:28:50 crc kubenswrapper[4803]: W0320 17:28:50.967746 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf484cc_0fe2_4cb0_99c2_0714910012ca.slice/crio-fb420f38f886bb77c7330705a5635f9a3390c0fd504417af742c14dab40eeab2 WatchSource:0}: Error finding container fb420f38f886bb77c7330705a5635f9a3390c0fd504417af742c14dab40eeab2: Status 404 returned error can't find the container with id fb420f38f886bb77c7330705a5635f9a3390c0fd504417af742c14dab40eeab2 Mar 20 17:28:51 crc kubenswrapper[4803]: I0320 17:28:51.027628 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb"] Mar 20 17:28:51 crc kubenswrapper[4803]: W0320 17:28:51.034765 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38d598c3_b9e5_4404_abb4_da1e9354e157.slice/crio-36c2418c4abcf4c49637f70e04e25021667e9aaa870b379284372f31d457f4d9 WatchSource:0}: Error finding container 36c2418c4abcf4c49637f70e04e25021667e9aaa870b379284372f31d457f4d9: Status 404 returned error can't find the container with id 36c2418c4abcf4c49637f70e04e25021667e9aaa870b379284372f31d457f4d9 Mar 20 17:28:51 crc kubenswrapper[4803]: I0320 17:28:51.761752 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8b7dccd6-8txp6" event={"ID":"a1fc6aa2-5ec5-4e13-9bea-90b9cc1468f1","Type":"ContainerStarted","Data":"6afddc012c063d7ec6bf160a546a469361c848f784b2740aa71dd491eb13f579"} Mar 20 17:28:51 crc kubenswrapper[4803]: I0320 17:28:51.765305 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" event={"ID":"0bf484cc-0fe2-4cb0-99c2-0714910012ca","Type":"ContainerStarted","Data":"fb420f38f886bb77c7330705a5635f9a3390c0fd504417af742c14dab40eeab2"} Mar 20 17:28:51 crc kubenswrapper[4803]: I0320 17:28:51.766507 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" event={"ID":"38d598c3-b9e5-4404-abb4-da1e9354e157","Type":"ContainerStarted","Data":"36c2418c4abcf4c49637f70e04e25021667e9aaa870b379284372f31d457f4d9"} Mar 20 17:28:51 crc kubenswrapper[4803]: I0320 17:28:51.830657 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8b7dccd6-8txp6" podStartSLOduration=1.830635525 podStartE2EDuration="1.830635525s" podCreationTimestamp="2026-03-20 17:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:28:51.825308564 +0000 UTC m=+741.736900644" watchObservedRunningTime="2026-03-20 17:28:51.830635525 +0000 UTC m=+741.742227605" Mar 20 17:28:52 crc kubenswrapper[4803]: I0320 17:28:52.333114 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v5qk5" Mar 20 17:28:52 crc kubenswrapper[4803]: I0320 17:28:52.371308 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v5qk5" Mar 20 17:28:52 crc kubenswrapper[4803]: I0320 17:28:52.561509 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5qk5"] Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.780879 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" event={"ID":"38d598c3-b9e5-4404-abb4-da1e9354e157","Type":"ContainerStarted","Data":"ccda53d39a63f7722761a41eae25e17c2d65dce5229bd96aa51e6fbaee519cdd"} Mar 20 
17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.782725 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" event={"ID":"0bf484cc-0fe2-4cb0-99c2-0714910012ca","Type":"ContainerStarted","Data":"f291dc3cf8a391b753f9f5f86c270c37ecb337422d3c59b50021590b76e1f94e"} Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.783465 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.785276 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-stkpl" event={"ID":"7130990e-c3d3-48fc-99a3-31f225ec19ee","Type":"ContainerStarted","Data":"f86bb82d062467b3bd3f892d4e177def0cbe091dcd4a853cbf848dc6481a9928"} Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.785360 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-stkpl" Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.786565 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" event={"ID":"8b0bc609-411b-43cc-b7cf-a88f669b2d44","Type":"ContainerStarted","Data":"9dfcf7c534f4eeab0487badec9a0ac11324db24cbc2f7100d632f1a77534479f"} Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.786588 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v5qk5" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="registry-server" containerID="cri-o://19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a" gracePeriod=2 Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.803690 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2nnmb" podStartSLOduration=2.240744729 podStartE2EDuration="4.803667804s" podCreationTimestamp="2026-03-20 
17:28:49 +0000 UTC" firstStartedPulling="2026-03-20 17:28:51.037221695 +0000 UTC m=+740.948813765" lastFinishedPulling="2026-03-20 17:28:53.60014473 +0000 UTC m=+743.511736840" observedRunningTime="2026-03-20 17:28:53.801435175 +0000 UTC m=+743.713027265" watchObservedRunningTime="2026-03-20 17:28:53.803667804 +0000 UTC m=+743.715259894" Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.837328 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f" podStartSLOduration=3.182925975 podStartE2EDuration="4.837309164s" podCreationTimestamp="2026-03-20 17:28:49 +0000 UTC" firstStartedPulling="2026-03-20 17:28:50.972192324 +0000 UTC m=+740.883784394" lastFinishedPulling="2026-03-20 17:28:52.626575473 +0000 UTC m=+742.538167583" observedRunningTime="2026-03-20 17:28:53.828025738 +0000 UTC m=+743.739617828" watchObservedRunningTime="2026-03-20 17:28:53.837309164 +0000 UTC m=+743.748901244" Mar 20 17:28:53 crc kubenswrapper[4803]: I0320 17:28:53.859124 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-stkpl" podStartSLOduration=2.424089131 podStartE2EDuration="4.859106601s" podCreationTimestamp="2026-03-20 17:28:49 +0000 UTC" firstStartedPulling="2026-03-20 17:28:50.143973164 +0000 UTC m=+740.055565234" lastFinishedPulling="2026-03-20 17:28:52.578990634 +0000 UTC m=+742.490582704" observedRunningTime="2026-03-20 17:28:53.855481385 +0000 UTC m=+743.767073455" watchObservedRunningTime="2026-03-20 17:28:53.859106601 +0000 UTC m=+743.770698671" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.271774 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v5qk5" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.353615 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-strnl\" (UniqueName: \"kubernetes.io/projected/fa2b2d8a-15ef-454c-a743-d3d421fb4262-kube-api-access-strnl\") pod \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.353701 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-utilities\") pod \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.353791 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-catalog-content\") pod \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\" (UID: \"fa2b2d8a-15ef-454c-a743-d3d421fb4262\") " Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.355491 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-utilities" (OuterVolumeSpecName: "utilities") pod "fa2b2d8a-15ef-454c-a743-d3d421fb4262" (UID: "fa2b2d8a-15ef-454c-a743-d3d421fb4262"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.360725 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2b2d8a-15ef-454c-a743-d3d421fb4262-kube-api-access-strnl" (OuterVolumeSpecName: "kube-api-access-strnl") pod "fa2b2d8a-15ef-454c-a743-d3d421fb4262" (UID: "fa2b2d8a-15ef-454c-a743-d3d421fb4262"). InnerVolumeSpecName "kube-api-access-strnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.455641 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-strnl\" (UniqueName: \"kubernetes.io/projected/fa2b2d8a-15ef-454c-a743-d3d421fb4262-kube-api-access-strnl\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.455680 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.541467 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa2b2d8a-15ef-454c-a743-d3d421fb4262" (UID: "fa2b2d8a-15ef-454c-a743-d3d421fb4262"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.557673 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa2b2d8a-15ef-454c-a743-d3d421fb4262-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.796920 4803 generic.go:334] "Generic (PLEG): container finished" podID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerID="19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a" exitCode=0 Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.796985 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v5qk5" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.797032 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5qk5" event={"ID":"fa2b2d8a-15ef-454c-a743-d3d421fb4262","Type":"ContainerDied","Data":"19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a"} Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.797068 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5qk5" event={"ID":"fa2b2d8a-15ef-454c-a743-d3d421fb4262","Type":"ContainerDied","Data":"a3dffe85b9266eb5f5b7042dee395300f9f6692c5607b30f38bf8ae7045ff2d3"} Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.797094 4803 scope.go:117] "RemoveContainer" containerID="19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.813750 4803 scope.go:117] "RemoveContainer" containerID="b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.858477 4803 scope.go:117] "RemoveContainer" containerID="802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.859399 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5qk5"] Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.865696 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v5qk5"] Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.871933 4803 scope.go:117] "RemoveContainer" containerID="19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a" Mar 20 17:28:54 crc kubenswrapper[4803]: E0320 17:28:54.872366 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a\": container with ID starting with 19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a not found: ID does not exist" containerID="19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.872423 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a"} err="failed to get container status \"19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a\": rpc error: code = NotFound desc = could not find container \"19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a\": container with ID starting with 19bb70cc976ec1802d2e0c1dfc21997b318c4c589cd2630b8dd3f648752b4a8a not found: ID does not exist" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.872466 4803 scope.go:117] "RemoveContainer" containerID="b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110" Mar 20 17:28:54 crc kubenswrapper[4803]: E0320 17:28:54.872820 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110\": container with ID starting with b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110 not found: ID does not exist" containerID="b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.872867 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110"} err="failed to get container status \"b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110\": rpc error: code = NotFound desc = could not find container \"b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110\": container with ID 
starting with b569f725bf51e0c1246d2b09c98e6effe9ebcc07358207f153ce6c4987df1110 not found: ID does not exist" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.872901 4803 scope.go:117] "RemoveContainer" containerID="802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e" Mar 20 17:28:54 crc kubenswrapper[4803]: E0320 17:28:54.873163 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e\": container with ID starting with 802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e not found: ID does not exist" containerID="802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e" Mar 20 17:28:54 crc kubenswrapper[4803]: I0320 17:28:54.873201 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e"} err="failed to get container status \"802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e\": rpc error: code = NotFound desc = could not find container \"802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e\": container with ID starting with 802ae76682ab01c7b5dc46bd7d3f693aafc93815476bb8ed2a5603375f2e1f5e not found: ID does not exist" Mar 20 17:28:56 crc kubenswrapper[4803]: I0320 17:28:56.871579 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" path="/var/lib/kubelet/pods/fa2b2d8a-15ef-454c-a743-d3d421fb4262/volumes" Mar 20 17:28:57 crc kubenswrapper[4803]: I0320 17:28:57.833491 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" event={"ID":"8b0bc609-411b-43cc-b7cf-a88f669b2d44","Type":"ContainerStarted","Data":"f2ada5187d778d69d41a2f21f6d3dd673cd933c89a2be49e0378b4573343783e"} Mar 20 17:28:57 crc kubenswrapper[4803]: I0320 17:28:57.861810 4803 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-fjk5k" podStartSLOduration=2.738929769 podStartE2EDuration="8.861782414s" podCreationTimestamp="2026-03-20 17:28:49 +0000 UTC" firstStartedPulling="2026-03-20 17:28:50.515290397 +0000 UTC m=+740.426882467" lastFinishedPulling="2026-03-20 17:28:56.638143022 +0000 UTC m=+746.549735112" observedRunningTime="2026-03-20 17:28:57.857949922 +0000 UTC m=+747.769542052" watchObservedRunningTime="2026-03-20 17:28:57.861782414 +0000 UTC m=+747.773374514"
Mar 20 17:29:00 crc kubenswrapper[4803]: I0320 17:29:00.142634 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-stkpl"
Mar 20 17:29:00 crc kubenswrapper[4803]: I0320 17:29:00.430714 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8b7dccd6-8txp6"
Mar 20 17:29:00 crc kubenswrapper[4803]: I0320 17:29:00.431492 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8b7dccd6-8txp6"
Mar 20 17:29:00 crc kubenswrapper[4803]: I0320 17:29:00.440478 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8b7dccd6-8txp6"
Mar 20 17:29:00 crc kubenswrapper[4803]: I0320 17:29:00.901462 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8b7dccd6-8txp6"
Mar 20 17:29:01 crc kubenswrapper[4803]: I0320 17:29:01.002817 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-t6lj8"]
Mar 20 17:29:08 crc kubenswrapper[4803]: I0320 17:29:08.245691 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:29:08 crc kubenswrapper[4803]: I0320 17:29:08.246391 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:29:10 crc kubenswrapper[4803]: I0320 17:29:10.676762 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-krj4f"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.764834 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"]
Mar 20 17:29:24 crc kubenswrapper[4803]: E0320 17:29:24.765976 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="extract-utilities"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.765999 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="extract-utilities"
Mar 20 17:29:24 crc kubenswrapper[4803]: E0320 17:29:24.766024 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="registry-server"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.766037 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="registry-server"
Mar 20 17:29:24 crc kubenswrapper[4803]: E0320 17:29:24.766054 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="extract-content"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.766068 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="extract-content"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.766293 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa2b2d8a-15ef-454c-a743-d3d421fb4262" containerName="registry-server"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.767716 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.770466 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.779062 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"]
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.839205 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5j67\" (UniqueName: \"kubernetes.io/projected/48f49a52-830d-4717-820b-9c214238244d-kube-api-access-t5j67\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.839316 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.839461 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.940812 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.940967 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5j67\" (UniqueName: \"kubernetes.io/projected/48f49a52-830d-4717-820b-9c214238244d-kube-api-access-t5j67\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.941236 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.941910 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.942772 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:24 crc kubenswrapper[4803]: I0320 17:29:24.975702 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5j67\" (UniqueName: \"kubernetes.io/projected/48f49a52-830d-4717-820b-9c214238244d-kube-api-access-t5j67\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:25 crc kubenswrapper[4803]: I0320 17:29:25.131158 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:25 crc kubenswrapper[4803]: I0320 17:29:25.448060 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"]
Mar 20 17:29:25 crc kubenswrapper[4803]: W0320 17:29:25.460114 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f49a52_830d_4717_820b_9c214238244d.slice/crio-fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040 WatchSource:0}: Error finding container fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040: Status 404 returned error can't find the container with id fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.064705 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-t6lj8" podUID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" containerName="console" containerID="cri-o://bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b" gracePeriod=15
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.100819 4803 generic.go:334] "Generic (PLEG): container finished" podID="48f49a52-830d-4717-820b-9c214238244d" containerID="aca13c5317970dc3ad6f887e612a0335434afe7c309ca1da383338c8f6b663a8" exitCode=0
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.100967 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc" event={"ID":"48f49a52-830d-4717-820b-9c214238244d","Type":"ContainerDied","Data":"aca13c5317970dc3ad6f887e612a0335434afe7c309ca1da383338c8f6b663a8"}
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.101022 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc" event={"ID":"48f49a52-830d-4717-820b-9c214238244d","Type":"ContainerStarted","Data":"fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040"}
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.104002 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.506506 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-t6lj8_2e41f10c-132c-4dcc-a6a8-7bac9cd48e52/console/0.log"
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.506581 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.565785 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-service-ca\") pod \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") "
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.565877 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js4nl\" (UniqueName: \"kubernetes.io/projected/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-kube-api-access-js4nl\") pod \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") "
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.565990 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-serving-cert\") pod \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") "
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566040 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-oauth-serving-cert\") pod \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") "
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566096 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-oauth-config\") pod \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") "
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566205 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-config\") pod \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") "
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566244 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-trusted-ca-bundle\") pod \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\" (UID: \"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52\") "
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566585 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-service-ca" (OuterVolumeSpecName: "service-ca") pod "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" (UID: "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566775 4803 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566792 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" (UID: "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.566806 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-config" (OuterVolumeSpecName: "console-config") pod "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" (UID: "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.567472 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" (UID: "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.572087 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-kube-api-access-js4nl" (OuterVolumeSpecName: "kube-api-access-js4nl") pod "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" (UID: "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52"). InnerVolumeSpecName "kube-api-access-js4nl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.573565 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" (UID: "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.573833 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" (UID: "2e41f10c-132c-4dcc-a6a8-7bac9cd48e52"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.668792 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js4nl\" (UniqueName: \"kubernetes.io/projected/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-kube-api-access-js4nl\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.668857 4803 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.668876 4803 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.668893 4803 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.668910 4803 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:26 crc kubenswrapper[4803]: I0320 17:29:26.668926 4803 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.110942 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-t6lj8_2e41f10c-132c-4dcc-a6a8-7bac9cd48e52/console/0.log"
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.111001 4803 generic.go:334] "Generic (PLEG): container finished" podID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" containerID="bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b" exitCode=2
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.111036 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t6lj8" event={"ID":"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52","Type":"ContainerDied","Data":"bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b"}
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.111064 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-t6lj8" event={"ID":"2e41f10c-132c-4dcc-a6a8-7bac9cd48e52","Type":"ContainerDied","Data":"3c135f9804508c758fc21f2655615ced914b20fd28b1d8d439f3715a72cdae66"}
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.111088 4803 scope.go:117] "RemoveContainer" containerID="bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b"
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.111151 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-t6lj8"
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.134392 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-t6lj8"]
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.140331 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-t6lj8"]
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.147494 4803 scope.go:117] "RemoveContainer" containerID="bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b"
Mar 20 17:29:27 crc kubenswrapper[4803]: E0320 17:29:27.148284 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b\": container with ID starting with bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b not found: ID does not exist" containerID="bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b"
Mar 20 17:29:27 crc kubenswrapper[4803]: I0320 17:29:27.148337 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b"} err="failed to get container status \"bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b\": rpc error: code = NotFound desc = could not find container \"bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b\": container with ID starting with bd12baae855f8cbdb309ccacd59f0aa779a1a8331d7eda2503a5ac088a41663b not found: ID does not exist"
Mar 20 17:29:28 crc kubenswrapper[4803]: I0320 17:29:28.121275 4803 generic.go:334] "Generic (PLEG): container finished" podID="48f49a52-830d-4717-820b-9c214238244d" containerID="fe5428bf376575e86096457c17f8aa28b632060cb851814a308e8f414b443165" exitCode=0
Mar 20 17:29:28 crc kubenswrapper[4803]: I0320 17:29:28.121335 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc" event={"ID":"48f49a52-830d-4717-820b-9c214238244d","Type":"ContainerDied","Data":"fe5428bf376575e86096457c17f8aa28b632060cb851814a308e8f414b443165"}
Mar 20 17:29:28 crc kubenswrapper[4803]: I0320 17:29:28.860049 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" path="/var/lib/kubelet/pods/2e41f10c-132c-4dcc-a6a8-7bac9cd48e52/volumes"
Mar 20 17:29:29 crc kubenswrapper[4803]: I0320 17:29:29.138143 4803 generic.go:334] "Generic (PLEG): container finished" podID="48f49a52-830d-4717-820b-9c214238244d" containerID="343a35114781bc7813fd2eada061dbddba85c81979c18e1c97db0eaabc83c856" exitCode=0
Mar 20 17:29:29 crc kubenswrapper[4803]: I0320 17:29:29.138255 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc" event={"ID":"48f49a52-830d-4717-820b-9c214238244d","Type":"ContainerDied","Data":"343a35114781bc7813fd2eada061dbddba85c81979c18e1c97db0eaabc83c856"}
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.472462 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.521496 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5j67\" (UniqueName: \"kubernetes.io/projected/48f49a52-830d-4717-820b-9c214238244d-kube-api-access-t5j67\") pod \"48f49a52-830d-4717-820b-9c214238244d\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") "
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.521625 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-util\") pod \"48f49a52-830d-4717-820b-9c214238244d\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") "
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.521738 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-bundle\") pod \"48f49a52-830d-4717-820b-9c214238244d\" (UID: \"48f49a52-830d-4717-820b-9c214238244d\") "
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.523093 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-bundle" (OuterVolumeSpecName: "bundle") pod "48f49a52-830d-4717-820b-9c214238244d" (UID: "48f49a52-830d-4717-820b-9c214238244d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.528165 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f49a52-830d-4717-820b-9c214238244d-kube-api-access-t5j67" (OuterVolumeSpecName: "kube-api-access-t5j67") pod "48f49a52-830d-4717-820b-9c214238244d" (UID: "48f49a52-830d-4717-820b-9c214238244d"). InnerVolumeSpecName "kube-api-access-t5j67". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.551110 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-util" (OuterVolumeSpecName: "util") pod "48f49a52-830d-4717-820b-9c214238244d" (UID: "48f49a52-830d-4717-820b-9c214238244d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.623671 4803 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.623728 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5j67\" (UniqueName: \"kubernetes.io/projected/48f49a52-830d-4717-820b-9c214238244d-kube-api-access-t5j67\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:30 crc kubenswrapper[4803]: I0320 17:29:30.623750 4803 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/48f49a52-830d-4717-820b-9c214238244d-util\") on node \"crc\" DevicePath \"\""
Mar 20 17:29:31 crc kubenswrapper[4803]: I0320 17:29:31.157649 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc" event={"ID":"48f49a52-830d-4717-820b-9c214238244d","Type":"ContainerDied","Data":"fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040"}
Mar 20 17:29:31 crc kubenswrapper[4803]: I0320 17:29:31.157721 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040"
Mar 20 17:29:31 crc kubenswrapper[4803]: I0320 17:29:31.158205 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc"
Mar 20 17:29:31 crc kubenswrapper[4803]: E0320 17:29:31.499006 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f49a52_830d_4717_820b_9c214238244d.slice/crio-fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040\": RecentStats: unable to find data in memory cache]"
Mar 20 17:29:38 crc kubenswrapper[4803]: I0320 17:29:38.246764 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:29:38 crc kubenswrapper[4803]: I0320 17:29:38.247718 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770062 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"]
Mar 20 17:29:39 crc kubenswrapper[4803]: E0320 17:29:39.770260 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f49a52-830d-4717-820b-9c214238244d" containerName="extract"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770272 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f49a52-830d-4717-820b-9c214238244d" containerName="extract"
Mar 20 17:29:39 crc kubenswrapper[4803]: E0320 17:29:39.770282 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f49a52-830d-4717-820b-9c214238244d" containerName="pull"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770287 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f49a52-830d-4717-820b-9c214238244d" containerName="pull"
Mar 20 17:29:39 crc kubenswrapper[4803]: E0320 17:29:39.770299 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f49a52-830d-4717-820b-9c214238244d" containerName="util"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770307 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f49a52-830d-4717-820b-9c214238244d" containerName="util"
Mar 20 17:29:39 crc kubenswrapper[4803]: E0320 17:29:39.770316 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" containerName="console"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770322 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" containerName="console"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770416 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f49a52-830d-4717-820b-9c214238244d" containerName="extract"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770425 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e41f10c-132c-4dcc-a6a8-7bac9cd48e52" containerName="console"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.770784 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.774488 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.774819 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.775179 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.776653 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9tmv2"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.777069 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.796756 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"]
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.868104 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/656f1985-be0a-4447-a03f-2ec4d11727c2-webhook-cert\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.868174 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/656f1985-be0a-4447-a03f-2ec4d11727c2-apiservice-cert\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.868216 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlwrz\" (UniqueName: \"kubernetes.io/projected/656f1985-be0a-4447-a03f-2ec4d11727c2-kube-api-access-hlwrz\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.968954 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/656f1985-be0a-4447-a03f-2ec4d11727c2-webhook-cert\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.969006 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/656f1985-be0a-4447-a03f-2ec4d11727c2-apiservice-cert\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.969038 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlwrz\" (UniqueName: \"kubernetes.io/projected/656f1985-be0a-4447-a03f-2ec4d11727c2-kube-api-access-hlwrz\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.975359 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/656f1985-be0a-4447-a03f-2ec4d11727c2-webhook-cert\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.987093 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/656f1985-be0a-4447-a03f-2ec4d11727c2-apiservice-cert\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:39 crc kubenswrapper[4803]: I0320 17:29:39.988006 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlwrz\" (UniqueName: \"kubernetes.io/projected/656f1985-be0a-4447-a03f-2ec4d11727c2-kube-api-access-hlwrz\") pod \"metallb-operator-controller-manager-599d9f9c9-jbh6h\" (UID: \"656f1985-be0a-4447-a03f-2ec4d11727c2\") " pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.086278 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.112027 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"]
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.112693 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.114962 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.115451 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-qnmbn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.116805 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.124616 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"]
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.171307 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92msv\" (UniqueName: \"kubernetes.io/projected/f939090a-abec-48ac-9e06-10175ff02c71-kube-api-access-92msv\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.171370 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f939090a-abec-48ac-9e06-10175ff02c71-webhook-cert\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.171399 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f939090a-abec-48ac-9e06-10175ff02c71-apiservice-cert\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.272429 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92msv\" (UniqueName: \"kubernetes.io/projected/f939090a-abec-48ac-9e06-10175ff02c71-kube-api-access-92msv\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.272473 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f939090a-abec-48ac-9e06-10175ff02c71-webhook-cert\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.272495 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f939090a-abec-48ac-9e06-10175ff02c71-apiservice-cert\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.275896 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f939090a-abec-48ac-9e06-10175ff02c71-apiservice-cert\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"
Mar 20 
17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.275990 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f939090a-abec-48ac-9e06-10175ff02c71-webhook-cert\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.293512 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92msv\" (UniqueName: \"kubernetes.io/projected/f939090a-abec-48ac-9e06-10175ff02c71-kube-api-access-92msv\") pod \"metallb-operator-webhook-server-66cfbc7d76-5tcqn\" (UID: \"f939090a-abec-48ac-9e06-10175ff02c71\") " pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.331817 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h"] Mar 20 17:29:40 crc kubenswrapper[4803]: W0320 17:29:40.341750 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod656f1985_be0a_4447_a03f_2ec4d11727c2.slice/crio-0a57dbf1aaacdf5abf736adf3a8225e82af868c0ad98d97588fe5c66ef53de2f WatchSource:0}: Error finding container 0a57dbf1aaacdf5abf736adf3a8225e82af868c0ad98d97588fe5c66ef53de2f: Status 404 returned error can't find the container with id 0a57dbf1aaacdf5abf736adf3a8225e82af868c0ad98d97588fe5c66ef53de2f Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.458990 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" Mar 20 17:29:40 crc kubenswrapper[4803]: I0320 17:29:40.693690 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn"] Mar 20 17:29:40 crc kubenswrapper[4803]: W0320 17:29:40.701137 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf939090a_abec_48ac_9e06_10175ff02c71.slice/crio-53d999d7f7dab276cf3a677b151443228cdb90ca84ce16de8c092cac30f632a8 WatchSource:0}: Error finding container 53d999d7f7dab276cf3a677b151443228cdb90ca84ce16de8c092cac30f632a8: Status 404 returned error can't find the container with id 53d999d7f7dab276cf3a677b151443228cdb90ca84ce16de8c092cac30f632a8 Mar 20 17:29:41 crc kubenswrapper[4803]: I0320 17:29:41.214182 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h" event={"ID":"656f1985-be0a-4447-a03f-2ec4d11727c2","Type":"ContainerStarted","Data":"0a57dbf1aaacdf5abf736adf3a8225e82af868c0ad98d97588fe5c66ef53de2f"} Mar 20 17:29:41 crc kubenswrapper[4803]: I0320 17:29:41.216092 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" event={"ID":"f939090a-abec-48ac-9e06-10175ff02c71","Type":"ContainerStarted","Data":"53d999d7f7dab276cf3a677b151443228cdb90ca84ce16de8c092cac30f632a8"} Mar 20 17:29:41 crc kubenswrapper[4803]: E0320 17:29:41.664448 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f49a52_830d_4717_820b_9c214238244d.slice/crio-fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040\": RecentStats: unable to find data in memory cache]" Mar 20 17:29:44 crc kubenswrapper[4803]: I0320 17:29:44.268991 4803 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h" event={"ID":"656f1985-be0a-4447-a03f-2ec4d11727c2","Type":"ContainerStarted","Data":"df26fef2fc3abb799766956a4de5bb988090da35eb6a4f9449511d7b121ea003"} Mar 20 17:29:44 crc kubenswrapper[4803]: I0320 17:29:44.269504 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h" Mar 20 17:29:44 crc kubenswrapper[4803]: I0320 17:29:44.327830 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h" podStartSLOduration=2.373256443 podStartE2EDuration="5.32781379s" podCreationTimestamp="2026-03-20 17:29:39 +0000 UTC" firstStartedPulling="2026-03-20 17:29:40.343423896 +0000 UTC m=+790.255015967" lastFinishedPulling="2026-03-20 17:29:43.297981244 +0000 UTC m=+793.209573314" observedRunningTime="2026-03-20 17:29:44.324989457 +0000 UTC m=+794.236581537" watchObservedRunningTime="2026-03-20 17:29:44.32781379 +0000 UTC m=+794.239405850" Mar 20 17:29:45 crc kubenswrapper[4803]: I0320 17:29:45.279315 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" event={"ID":"f939090a-abec-48ac-9e06-10175ff02c71","Type":"ContainerStarted","Data":"221fac48ea38a24a5673729c2cf5b60751ff0c46fe699f0eb59dcffe09e18ec2"} Mar 20 17:29:45 crc kubenswrapper[4803]: I0320 17:29:45.308864 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" podStartSLOduration=1.109768866 podStartE2EDuration="5.308841718s" podCreationTimestamp="2026-03-20 17:29:40 +0000 UTC" firstStartedPulling="2026-03-20 17:29:40.703993334 +0000 UTC m=+790.615585414" lastFinishedPulling="2026-03-20 17:29:44.903066196 +0000 UTC m=+794.814658266" observedRunningTime="2026-03-20 17:29:45.30241686 +0000 UTC 
m=+795.214008970" watchObservedRunningTime="2026-03-20 17:29:45.308841718 +0000 UTC m=+795.220433828" Mar 20 17:29:46 crc kubenswrapper[4803]: I0320 17:29:46.286498 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" Mar 20 17:29:51 crc kubenswrapper[4803]: E0320 17:29:51.789202 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f49a52_830d_4717_820b_9c214238244d.slice/crio-fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040\": RecentStats: unable to find data in memory cache]" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.132756 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567130-fhsx6"] Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.134186 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-fhsx6" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.139894 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.140523 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.140749 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.146048 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2"] Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.150063 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.153490 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.157666 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-fhsx6"] Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.159094 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.175925 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2"] Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.319297 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df2e7069-f589-4f29-b6aa-c4622603e334-config-volume\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.319362 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ptv9\" (UniqueName: \"kubernetes.io/projected/df2e7069-f589-4f29-b6aa-c4622603e334-kube-api-access-2ptv9\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.319393 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/df2e7069-f589-4f29-b6aa-c4622603e334-secret-volume\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.319435 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mn7f\" (UniqueName: \"kubernetes.io/projected/96522cae-d723-456d-be53-43bcdcda0146-kube-api-access-8mn7f\") pod \"auto-csr-approver-29567130-fhsx6\" (UID: \"96522cae-d723-456d-be53-43bcdcda0146\") " pod="openshift-infra/auto-csr-approver-29567130-fhsx6" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.419947 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df2e7069-f589-4f29-b6aa-c4622603e334-config-volume\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.419998 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ptv9\" (UniqueName: \"kubernetes.io/projected/df2e7069-f589-4f29-b6aa-c4622603e334-kube-api-access-2ptv9\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.420018 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df2e7069-f589-4f29-b6aa-c4622603e334-secret-volume\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 
17:30:00.420046 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mn7f\" (UniqueName: \"kubernetes.io/projected/96522cae-d723-456d-be53-43bcdcda0146-kube-api-access-8mn7f\") pod \"auto-csr-approver-29567130-fhsx6\" (UID: \"96522cae-d723-456d-be53-43bcdcda0146\") " pod="openshift-infra/auto-csr-approver-29567130-fhsx6" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.421734 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df2e7069-f589-4f29-b6aa-c4622603e334-config-volume\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.428785 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df2e7069-f589-4f29-b6aa-c4622603e334-secret-volume\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.445866 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mn7f\" (UniqueName: \"kubernetes.io/projected/96522cae-d723-456d-be53-43bcdcda0146-kube-api-access-8mn7f\") pod \"auto-csr-approver-29567130-fhsx6\" (UID: \"96522cae-d723-456d-be53-43bcdcda0146\") " pod="openshift-infra/auto-csr-approver-29567130-fhsx6" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.460751 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ptv9\" (UniqueName: \"kubernetes.io/projected/df2e7069-f589-4f29-b6aa-c4622603e334-kube-api-access-2ptv9\") pod \"collect-profiles-29567130-vlcc2\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.461065 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-fhsx6" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.464090 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66cfbc7d76-5tcqn" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.521831 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.946557 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-fhsx6"] Mar 20 17:30:00 crc kubenswrapper[4803]: I0320 17:30:00.974098 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2"] Mar 20 17:30:00 crc kubenswrapper[4803]: W0320 17:30:00.975394 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf2e7069_f589_4f29_b6aa_c4622603e334.slice/crio-b68282f09c8a694919fddc11aaf9cccc1ced5bc9a88324040402da5d13d8da85 WatchSource:0}: Error finding container b68282f09c8a694919fddc11aaf9cccc1ced5bc9a88324040402da5d13d8da85: Status 404 returned error can't find the container with id b68282f09c8a694919fddc11aaf9cccc1ced5bc9a88324040402da5d13d8da85 Mar 20 17:30:01 crc kubenswrapper[4803]: I0320 17:30:01.383080 4803 generic.go:334] "Generic (PLEG): container finished" podID="df2e7069-f589-4f29-b6aa-c4622603e334" containerID="83900e21ba5143ee4fa6705d4b516b5e041200bd9912731f132257c91f5830c3" exitCode=0 Mar 20 17:30:01 crc kubenswrapper[4803]: I0320 17:30:01.383182 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" event={"ID":"df2e7069-f589-4f29-b6aa-c4622603e334","Type":"ContainerDied","Data":"83900e21ba5143ee4fa6705d4b516b5e041200bd9912731f132257c91f5830c3"} Mar 20 17:30:01 crc kubenswrapper[4803]: I0320 17:30:01.383443 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" event={"ID":"df2e7069-f589-4f29-b6aa-c4622603e334","Type":"ContainerStarted","Data":"b68282f09c8a694919fddc11aaf9cccc1ced5bc9a88324040402da5d13d8da85"} Mar 20 17:30:01 crc kubenswrapper[4803]: I0320 17:30:01.384277 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-fhsx6" event={"ID":"96522cae-d723-456d-be53-43bcdcda0146","Type":"ContainerStarted","Data":"e0e7bb9c7a5b040ce66be1e3031d03d576a0f84097fd00f3b2cacd263d3cb51a"} Mar 20 17:30:01 crc kubenswrapper[4803]: E0320 17:30:01.989135 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f49a52_830d_4717_820b_9c214238244d.slice/crio-fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040\": RecentStats: unable to find data in memory cache]" Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.672921 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.855085 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df2e7069-f589-4f29-b6aa-c4622603e334-secret-volume\") pod \"df2e7069-f589-4f29-b6aa-c4622603e334\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.855193 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df2e7069-f589-4f29-b6aa-c4622603e334-config-volume\") pod \"df2e7069-f589-4f29-b6aa-c4622603e334\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.855286 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ptv9\" (UniqueName: \"kubernetes.io/projected/df2e7069-f589-4f29-b6aa-c4622603e334-kube-api-access-2ptv9\") pod \"df2e7069-f589-4f29-b6aa-c4622603e334\" (UID: \"df2e7069-f589-4f29-b6aa-c4622603e334\") " Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.857260 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2e7069-f589-4f29-b6aa-c4622603e334-config-volume" (OuterVolumeSpecName: "config-volume") pod "df2e7069-f589-4f29-b6aa-c4622603e334" (UID: "df2e7069-f589-4f29-b6aa-c4622603e334"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.870710 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2e7069-f589-4f29-b6aa-c4622603e334-kube-api-access-2ptv9" (OuterVolumeSpecName: "kube-api-access-2ptv9") pod "df2e7069-f589-4f29-b6aa-c4622603e334" (UID: "df2e7069-f589-4f29-b6aa-c4622603e334"). 
InnerVolumeSpecName "kube-api-access-2ptv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.871379 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2e7069-f589-4f29-b6aa-c4622603e334-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "df2e7069-f589-4f29-b6aa-c4622603e334" (UID: "df2e7069-f589-4f29-b6aa-c4622603e334"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.956563 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ptv9\" (UniqueName: \"kubernetes.io/projected/df2e7069-f589-4f29-b6aa-c4622603e334-kube-api-access-2ptv9\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.956596 4803 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df2e7069-f589-4f29-b6aa-c4622603e334-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:02 crc kubenswrapper[4803]: I0320 17:30:02.956605 4803 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df2e7069-f589-4f29-b6aa-c4622603e334-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:03 crc kubenswrapper[4803]: I0320 17:30:03.407279 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" event={"ID":"df2e7069-f589-4f29-b6aa-c4622603e334","Type":"ContainerDied","Data":"b68282f09c8a694919fddc11aaf9cccc1ced5bc9a88324040402da5d13d8da85"} Mar 20 17:30:03 crc kubenswrapper[4803]: I0320 17:30:03.407511 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b68282f09c8a694919fddc11aaf9cccc1ced5bc9a88324040402da5d13d8da85" Mar 20 17:30:03 crc kubenswrapper[4803]: I0320 17:30:03.407323 4803 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2" Mar 20 17:30:03 crc kubenswrapper[4803]: I0320 17:30:03.411169 4803 generic.go:334] "Generic (PLEG): container finished" podID="96522cae-d723-456d-be53-43bcdcda0146" containerID="911b1a988b08afad42a215b61511b297323939eaaf8a72533b8ea62161019f7b" exitCode=0 Mar 20 17:30:03 crc kubenswrapper[4803]: I0320 17:30:03.411221 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-fhsx6" event={"ID":"96522cae-d723-456d-be53-43bcdcda0146","Type":"ContainerDied","Data":"911b1a988b08afad42a215b61511b297323939eaaf8a72533b8ea62161019f7b"} Mar 20 17:30:04 crc kubenswrapper[4803]: I0320 17:30:04.790659 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-fhsx6" Mar 20 17:30:04 crc kubenswrapper[4803]: I0320 17:30:04.926045 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mn7f\" (UniqueName: \"kubernetes.io/projected/96522cae-d723-456d-be53-43bcdcda0146-kube-api-access-8mn7f\") pod \"96522cae-d723-456d-be53-43bcdcda0146\" (UID: \"96522cae-d723-456d-be53-43bcdcda0146\") " Mar 20 17:30:04 crc kubenswrapper[4803]: I0320 17:30:04.941698 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96522cae-d723-456d-be53-43bcdcda0146-kube-api-access-8mn7f" (OuterVolumeSpecName: "kube-api-access-8mn7f") pod "96522cae-d723-456d-be53-43bcdcda0146" (UID: "96522cae-d723-456d-be53-43bcdcda0146"). InnerVolumeSpecName "kube-api-access-8mn7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:05 crc kubenswrapper[4803]: I0320 17:30:05.027890 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mn7f\" (UniqueName: \"kubernetes.io/projected/96522cae-d723-456d-be53-43bcdcda0146-kube-api-access-8mn7f\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:05 crc kubenswrapper[4803]: I0320 17:30:05.428242 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567130-fhsx6" event={"ID":"96522cae-d723-456d-be53-43bcdcda0146","Type":"ContainerDied","Data":"e0e7bb9c7a5b040ce66be1e3031d03d576a0f84097fd00f3b2cacd263d3cb51a"} Mar 20 17:30:05 crc kubenswrapper[4803]: I0320 17:30:05.428296 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0e7bb9c7a5b040ce66be1e3031d03d576a0f84097fd00f3b2cacd263d3cb51a" Mar 20 17:30:05 crc kubenswrapper[4803]: I0320 17:30:05.428371 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567130-fhsx6" Mar 20 17:30:05 crc kubenswrapper[4803]: I0320 17:30:05.851254 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-tmtll"] Mar 20 17:30:05 crc kubenswrapper[4803]: I0320 17:30:05.865422 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567124-tmtll"] Mar 20 17:30:06 crc kubenswrapper[4803]: I0320 17:30:06.871739 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28cdfb98-cfe7-42a7-8ee0-84d618800c5b" path="/var/lib/kubelet/pods/28cdfb98-cfe7-42a7-8ee0-84d618800c5b/volumes" Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.245756 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.245819 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.245868 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.246507 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b24a42c256c66e693667314faf1192fa4d816bc92003b13635ba6ca267c3a888"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.246611 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://b24a42c256c66e693667314faf1192fa4d816bc92003b13635ba6ca267c3a888" gracePeriod=600 Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.457642 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="b24a42c256c66e693667314faf1192fa4d816bc92003b13635ba6ca267c3a888" exitCode=0 Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.457716 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"b24a42c256c66e693667314faf1192fa4d816bc92003b13635ba6ca267c3a888"} Mar 20 17:30:08 crc kubenswrapper[4803]: I0320 17:30:08.457769 4803 scope.go:117] "RemoveContainer" containerID="dab0fb0f3d947efe9367eb4b30fc777ab2c8a308cb11da87dd8cf745d61564c4" Mar 20 17:30:09 crc kubenswrapper[4803]: I0320 17:30:09.510837 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"6b7da3dcd15f360247e71d17a90f38b394013be03e278a625e311daaaf87c2cd"} Mar 20 17:30:11 crc kubenswrapper[4803]: I0320 17:30:11.063952 4803 scope.go:117] "RemoveContainer" containerID="cc9b198f8d5a17014c5ef6d250ebdc17c859a9cdb5adb8c043470954f0e136c5" Mar 20 17:30:12 crc kubenswrapper[4803]: E0320 17:30:12.182676 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f49a52_830d_4717_820b_9c214238244d.slice/crio-fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040\": RecentStats: unable to find data in memory cache]" Mar 20 17:30:20 crc kubenswrapper[4803]: I0320 17:30:20.090364 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-599d9f9c9-jbh6h" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.707384 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wkwlf"] Mar 20 17:30:21 crc kubenswrapper[4803]: E0320 17:30:21.708177 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2e7069-f589-4f29-b6aa-c4622603e334" containerName="collect-profiles" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.708192 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2e7069-f589-4f29-b6aa-c4622603e334" 
containerName="collect-profiles" Mar 20 17:30:21 crc kubenswrapper[4803]: E0320 17:30:21.708211 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96522cae-d723-456d-be53-43bcdcda0146" containerName="oc" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.708220 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="96522cae-d723-456d-be53-43bcdcda0146" containerName="oc" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.708353 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2e7069-f589-4f29-b6aa-c4622603e334" containerName="collect-profiles" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.708372 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="96522cae-d723-456d-be53-43bcdcda0146" containerName="oc" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.710749 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj"] Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.711334 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.711865 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.714154 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.714240 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.716121 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.717343 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-b8rnn" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.745315 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj"] Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.778487 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qjwcn"] Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.779540 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qjwcn" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.780913 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.780975 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-q8pz6" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.780920 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.782337 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.809405 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-24x6b"] Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.810278 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.811889 4803 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817386 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cv92\" (UniqueName: \"kubernetes.io/projected/3181265c-ba7e-4f29-9950-bfefd81e98e5-kube-api-access-6cv92\") pod \"frr-k8s-webhook-server-bcc4b6f68-6nwbj\" (UID: \"3181265c-ba7e-4f29-9950-bfefd81e98e5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817425 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4367c661-d351-46c9-9a1f-87f039fe6458-frr-startup\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817447 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-metrics\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817490 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-frr-conf\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817538 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" 
(UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-frr-sockets\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817582 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3181265c-ba7e-4f29-9950-bfefd81e98e5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6nwbj\" (UID: \"3181265c-ba7e-4f29-9950-bfefd81e98e5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817602 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4367c661-d351-46c9-9a1f-87f039fe6458-metrics-certs\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817618 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2xd4\" (UniqueName: \"kubernetes.io/projected/4367c661-d351-46c9-9a1f-87f039fe6458-kube-api-access-x2xd4\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.817663 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-reloader\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.821474 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-24x6b"] Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918330 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918380 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-metrics-certs\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918411 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-reloader\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918428 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b632859-081b-4be0-a3f6-9b91b4687ecf-metallb-excludel2\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918649 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cv92\" (UniqueName: \"kubernetes.io/projected/3181265c-ba7e-4f29-9950-bfefd81e98e5-kube-api-access-6cv92\") pod \"frr-k8s-webhook-server-bcc4b6f68-6nwbj\" (UID: \"3181265c-ba7e-4f29-9950-bfefd81e98e5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918716 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4367c661-d351-46c9-9a1f-87f039fe6458-frr-startup\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918757 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-metrics\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918831 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-frr-conf\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918871 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-metrics-certs\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918906 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-frr-sockets\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.918994 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3181265c-ba7e-4f29-9950-bfefd81e98e5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6nwbj\" (UID: 
\"3181265c-ba7e-4f29-9950-bfefd81e98e5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919039 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89jd4\" (UniqueName: \"kubernetes.io/projected/6b632859-081b-4be0-a3f6-9b91b4687ecf-kube-api-access-89jd4\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919076 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-cert\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:21 crc kubenswrapper[4803]: E0320 17:30:21.919093 4803 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919112 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4367c661-d351-46c9-9a1f-87f039fe6458-metrics-certs\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: E0320 17:30:21.919135 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3181265c-ba7e-4f29-9950-bfefd81e98e5-cert podName:3181265c-ba7e-4f29-9950-bfefd81e98e5 nodeName:}" failed. No retries permitted until 2026-03-20 17:30:22.419121466 +0000 UTC m=+832.330713526 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3181265c-ba7e-4f29-9950-bfefd81e98e5-cert") pod "frr-k8s-webhook-server-bcc4b6f68-6nwbj" (UID: "3181265c-ba7e-4f29-9950-bfefd81e98e5") : secret "frr-k8s-webhook-server-cert" not found Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919158 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2xd4\" (UniqueName: \"kubernetes.io/projected/4367c661-d351-46c9-9a1f-87f039fe6458-kube-api-access-x2xd4\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919197 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb52j\" (UniqueName: \"kubernetes.io/projected/06c64fc0-e716-455b-bff4-0aac055505a9-kube-api-access-sb52j\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919438 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-reloader\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919514 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-frr-conf\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919558 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-metrics\") pod 
\"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919660 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4367c661-d351-46c9-9a1f-87f039fe6458-frr-startup\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.919837 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4367c661-d351-46c9-9a1f-87f039fe6458-frr-sockets\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.927229 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4367c661-d351-46c9-9a1f-87f039fe6458-metrics-certs\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.938125 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cv92\" (UniqueName: \"kubernetes.io/projected/3181265c-ba7e-4f29-9950-bfefd81e98e5-kube-api-access-6cv92\") pod \"frr-k8s-webhook-server-bcc4b6f68-6nwbj\" (UID: \"3181265c-ba7e-4f29-9950-bfefd81e98e5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:21 crc kubenswrapper[4803]: I0320 17:30:21.939340 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2xd4\" (UniqueName: \"kubernetes.io/projected/4367c661-d351-46c9-9a1f-87f039fe6458-kube-api-access-x2xd4\") pod \"frr-k8s-wkwlf\" (UID: \"4367c661-d351-46c9-9a1f-87f039fe6458\") " pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 
17:30:22.020318 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89jd4\" (UniqueName: \"kubernetes.io/projected/6b632859-081b-4be0-a3f6-9b91b4687ecf-kube-api-access-89jd4\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.020366 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-cert\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.020391 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb52j\" (UniqueName: \"kubernetes.io/projected/06c64fc0-e716-455b-bff4-0aac055505a9-kube-api-access-sb52j\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.020426 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.020472 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-metrics-certs\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.020503 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/6b632859-081b-4be0-a3f6-9b91b4687ecf-metallb-excludel2\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.020630 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-metrics-certs\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: E0320 17:30:22.020762 4803 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 17:30:22 crc kubenswrapper[4803]: E0320 17:30:22.020816 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-metrics-certs podName:06c64fc0-e716-455b-bff4-0aac055505a9 nodeName:}" failed. No retries permitted until 2026-03-20 17:30:22.520797744 +0000 UTC m=+832.432389834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-metrics-certs") pod "controller-7bb4cc7c98-24x6b" (UID: "06c64fc0-e716-455b-bff4-0aac055505a9") : secret "controller-certs-secret" not found Mar 20 17:30:22 crc kubenswrapper[4803]: E0320 17:30:22.022168 4803 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:30:22 crc kubenswrapper[4803]: E0320 17:30:22.022241 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist podName:6b632859-081b-4be0-a3f6-9b91b4687ecf nodeName:}" failed. No retries permitted until 2026-03-20 17:30:22.522219605 +0000 UTC m=+832.433811675 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist") pod "speaker-qjwcn" (UID: "6b632859-081b-4be0-a3f6-9b91b4687ecf") : secret "metallb-memberlist" not found Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.022864 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6b632859-081b-4be0-a3f6-9b91b4687ecf-metallb-excludel2\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.024607 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-metrics-certs\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.030049 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-cert\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.038729 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89jd4\" (UniqueName: \"kubernetes.io/projected/6b632859-081b-4be0-a3f6-9b91b4687ecf-kube-api-access-89jd4\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.047537 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.057128 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb52j\" (UniqueName: \"kubernetes.io/projected/06c64fc0-e716-455b-bff4-0aac055505a9-kube-api-access-sb52j\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: E0320 17:30:22.318894 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f49a52_830d_4717_820b_9c214238244d.slice/crio-fa04d81bbb185df00126c683a69b5f094685b363f0773f6dd112198481a6b040\": RecentStats: unable to find data in memory cache]" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.424466 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3181265c-ba7e-4f29-9950-bfefd81e98e5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6nwbj\" (UID: \"3181265c-ba7e-4f29-9950-bfefd81e98e5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.428105 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3181265c-ba7e-4f29-9950-bfefd81e98e5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-6nwbj\" (UID: \"3181265c-ba7e-4f29-9950-bfefd81e98e5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.526489 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 
17:30:22 crc kubenswrapper[4803]: E0320 17:30:22.526737 4803 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.526778 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-metrics-certs\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: E0320 17:30:22.526840 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist podName:6b632859-081b-4be0-a3f6-9b91b4687ecf nodeName:}" failed. No retries permitted until 2026-03-20 17:30:23.526814081 +0000 UTC m=+833.438406221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist") pod "speaker-qjwcn" (UID: "6b632859-081b-4be0-a3f6-9b91b4687ecf") : secret "metallb-memberlist" not found Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.531783 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/06c64fc0-e716-455b-bff4-0aac055505a9-metrics-certs\") pod \"controller-7bb4cc7c98-24x6b\" (UID: \"06c64fc0-e716-455b-bff4-0aac055505a9\") " pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.604575 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerStarted","Data":"1a2edcb9a8576390be8e80ed13ff0bd757c9e90945ab0c23bc11040318321cac"} Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.636738 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.722474 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:22 crc kubenswrapper[4803]: I0320 17:30:22.956287 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-24x6b"] Mar 20 17:30:22 crc kubenswrapper[4803]: W0320 17:30:22.964370 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06c64fc0_e716_455b_bff4_0aac055505a9.slice/crio-eea66463fdfe3ba1591c33a6d143bba0472bd6a692556937d525df5168b7ad37 WatchSource:0}: Error finding container eea66463fdfe3ba1591c33a6d143bba0472bd6a692556937d525df5168b7ad37: Status 404 returned error can't find the container with id eea66463fdfe3ba1591c33a6d143bba0472bd6a692556937d525df5168b7ad37 Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.116207 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj"] Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.552977 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.560154 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6b632859-081b-4be0-a3f6-9b91b4687ecf-memberlist\") pod \"speaker-qjwcn\" (UID: \"6b632859-081b-4be0-a3f6-9b91b4687ecf\") " pod="metallb-system/speaker-qjwcn" Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.594216 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qjwcn" Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.629517 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-24x6b" event={"ID":"06c64fc0-e716-455b-bff4-0aac055505a9","Type":"ContainerStarted","Data":"cc58ecef2d3c60ce1b26b30c0008e38a59a15b7a48fde7c84cdfe3b0d9798eee"} Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.629573 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-24x6b" event={"ID":"06c64fc0-e716-455b-bff4-0aac055505a9","Type":"ContainerStarted","Data":"281b59b8c17a4e30c904ce291e189f0eb6aa9d29e32a21ef3f685cf09549fcff"} Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.629583 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-24x6b" event={"ID":"06c64fc0-e716-455b-bff4-0aac055505a9","Type":"ContainerStarted","Data":"eea66463fdfe3ba1591c33a6d143bba0472bd6a692556937d525df5168b7ad37"} Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.629618 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.631190 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" event={"ID":"3181265c-ba7e-4f29-9950-bfefd81e98e5","Type":"ContainerStarted","Data":"32ee6e18736d1ba62c609480c04debc1aa2f8cbd7b19e0815a4b043027abdb33"} Mar 20 17:30:23 crc kubenswrapper[4803]: W0320 17:30:23.635095 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b632859_081b_4be0_a3f6_9b91b4687ecf.slice/crio-93617afdb759f723fdaeceb0eacea4fcd25c7a5cbe10e990b63a3aa0d6a8de50 WatchSource:0}: Error finding container 93617afdb759f723fdaeceb0eacea4fcd25c7a5cbe10e990b63a3aa0d6a8de50: Status 404 returned error can't find the container with 
id 93617afdb759f723fdaeceb0eacea4fcd25c7a5cbe10e990b63a3aa0d6a8de50 Mar 20 17:30:23 crc kubenswrapper[4803]: I0320 17:30:23.651120 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-24x6b" podStartSLOduration=2.651106193 podStartE2EDuration="2.651106193s" podCreationTimestamp="2026-03-20 17:30:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:30:23.647402585 +0000 UTC m=+833.558994655" watchObservedRunningTime="2026-03-20 17:30:23.651106193 +0000 UTC m=+833.562698263" Mar 20 17:30:24 crc kubenswrapper[4803]: I0320 17:30:24.642099 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qjwcn" event={"ID":"6b632859-081b-4be0-a3f6-9b91b4687ecf","Type":"ContainerStarted","Data":"06693f7da67707a9772386a5880381990880054fd663e1b9ff01056f2e798080"} Mar 20 17:30:24 crc kubenswrapper[4803]: I0320 17:30:24.642318 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qjwcn" event={"ID":"6b632859-081b-4be0-a3f6-9b91b4687ecf","Type":"ContainerStarted","Data":"114de9f0814bdfbb0d8058fa0c4eedce48d9cde5dc008210cac84d59673e8789"} Mar 20 17:30:24 crc kubenswrapper[4803]: I0320 17:30:24.642328 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qjwcn" event={"ID":"6b632859-081b-4be0-a3f6-9b91b4687ecf","Type":"ContainerStarted","Data":"93617afdb759f723fdaeceb0eacea4fcd25c7a5cbe10e990b63a3aa0d6a8de50"} Mar 20 17:30:24 crc kubenswrapper[4803]: I0320 17:30:24.642818 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qjwcn" Mar 20 17:30:24 crc kubenswrapper[4803]: I0320 17:30:24.664565 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qjwcn" podStartSLOduration=3.66455187 podStartE2EDuration="3.66455187s" podCreationTimestamp="2026-03-20 17:30:21 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:30:24.663122109 +0000 UTC m=+834.574714199" watchObservedRunningTime="2026-03-20 17:30:24.66455187 +0000 UTC m=+834.576143930" Mar 20 17:30:29 crc kubenswrapper[4803]: I0320 17:30:29.671870 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" event={"ID":"3181265c-ba7e-4f29-9950-bfefd81e98e5","Type":"ContainerStarted","Data":"81750f33684b1b4c795b973c43467a8e935b068c3efb1238bdc06c7562ff643a"} Mar 20 17:30:29 crc kubenswrapper[4803]: I0320 17:30:29.672492 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:29 crc kubenswrapper[4803]: I0320 17:30:29.673844 4803 generic.go:334] "Generic (PLEG): container finished" podID="4367c661-d351-46c9-9a1f-87f039fe6458" containerID="96aa84ee9dd283d53f36b6b2f106c0e85bc67173db62070e0d26e7ca5a7b67fb" exitCode=0 Mar 20 17:30:29 crc kubenswrapper[4803]: I0320 17:30:29.673908 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerDied","Data":"96aa84ee9dd283d53f36b6b2f106c0e85bc67173db62070e0d26e7ca5a7b67fb"} Mar 20 17:30:29 crc kubenswrapper[4803]: I0320 17:30:29.708505 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" podStartSLOduration=2.509013344 podStartE2EDuration="8.708476263s" podCreationTimestamp="2026-03-20 17:30:21 +0000 UTC" firstStartedPulling="2026-03-20 17:30:23.131438207 +0000 UTC m=+833.043030277" lastFinishedPulling="2026-03-20 17:30:29.330901116 +0000 UTC m=+839.242493196" observedRunningTime="2026-03-20 17:30:29.69301678 +0000 UTC m=+839.604608980" watchObservedRunningTime="2026-03-20 17:30:29.708476263 +0000 UTC m=+839.620068343" Mar 20 
17:30:30 crc kubenswrapper[4803]: I0320 17:30:30.685436 4803 generic.go:334] "Generic (PLEG): container finished" podID="4367c661-d351-46c9-9a1f-87f039fe6458" containerID="920ca2c0870100eeb56ea944a07bfaca85d5676f899724482375ec1437ce71b3" exitCode=0 Mar 20 17:30:30 crc kubenswrapper[4803]: I0320 17:30:30.685604 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerDied","Data":"920ca2c0870100eeb56ea944a07bfaca85d5676f899724482375ec1437ce71b3"} Mar 20 17:30:31 crc kubenswrapper[4803]: I0320 17:30:31.696549 4803 generic.go:334] "Generic (PLEG): container finished" podID="4367c661-d351-46c9-9a1f-87f039fe6458" containerID="7930d8fa7a81e95d4e884ee72f1a6182317a7a669dacfe075378ab978e2636e0" exitCode=0 Mar 20 17:30:31 crc kubenswrapper[4803]: I0320 17:30:31.696655 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerDied","Data":"7930d8fa7a81e95d4e884ee72f1a6182317a7a669dacfe075378ab978e2636e0"} Mar 20 17:30:32 crc kubenswrapper[4803]: I0320 17:30:32.708641 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerStarted","Data":"ae794377ca11ee763bda1fb0f04d9693c7386a8c454b5ef173227d6a070a51d6"} Mar 20 17:30:32 crc kubenswrapper[4803]: I0320 17:30:32.709162 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerStarted","Data":"18b1953d6781978d8d4fbe3232a3bdf871d6710b7efacd04eacdffeb1c386d84"} Mar 20 17:30:32 crc kubenswrapper[4803]: I0320 17:30:32.709179 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" 
event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerStarted","Data":"b56b1c31a7f8acc78b4b72c695ca90084b057dfd595a0204ba047532996ffe12"} Mar 20 17:30:32 crc kubenswrapper[4803]: I0320 17:30:32.709193 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerStarted","Data":"f5c6a66a1df3905b0aad545e69c3e1b3516422d08e1c30bbdfb0bb68b6ff0e6e"} Mar 20 17:30:32 crc kubenswrapper[4803]: I0320 17:30:32.709206 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerStarted","Data":"9fd132bfaddb9919ee2b89200629cfa11d2f1828fc831c8c19a9172044a60746"} Mar 20 17:30:33 crc kubenswrapper[4803]: I0320 17:30:33.597848 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qjwcn" Mar 20 17:30:33 crc kubenswrapper[4803]: I0320 17:30:33.721088 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wkwlf" event={"ID":"4367c661-d351-46c9-9a1f-87f039fe6458","Type":"ContainerStarted","Data":"933fe0ba2c317b319fe0c7ee65452ab848aa5dbbfc8b27fb37be690afad6adee"} Mar 20 17:30:33 crc kubenswrapper[4803]: I0320 17:30:33.751922 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wkwlf" podStartSLOduration=5.57072935 podStartE2EDuration="12.751883846s" podCreationTimestamp="2026-03-20 17:30:21 +0000 UTC" firstStartedPulling="2026-03-20 17:30:22.175636458 +0000 UTC m=+832.087228528" lastFinishedPulling="2026-03-20 17:30:29.356790944 +0000 UTC m=+839.268383024" observedRunningTime="2026-03-20 17:30:33.749668521 +0000 UTC m=+843.661260601" watchObservedRunningTime="2026-03-20 17:30:33.751883846 +0000 UTC m=+843.663475936" Mar 20 17:30:34 crc kubenswrapper[4803]: I0320 17:30:34.732022 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.048700 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.074138 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-x97qd"] Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.075718 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x97qd" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.085227 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.085246 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-p6m5c" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.085241 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.091196 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x97qd"] Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.156393 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.160275 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srv6t\" (UniqueName: \"kubernetes.io/projected/1741d1cd-5699-4fb2-b28d-dbd7744242d8-kube-api-access-srv6t\") pod \"openstack-operator-index-x97qd\" (UID: \"1741d1cd-5699-4fb2-b28d-dbd7744242d8\") " pod="openstack-operators/openstack-operator-index-x97qd" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.261047 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srv6t\" (UniqueName: \"kubernetes.io/projected/1741d1cd-5699-4fb2-b28d-dbd7744242d8-kube-api-access-srv6t\") pod \"openstack-operator-index-x97qd\" (UID: \"1741d1cd-5699-4fb2-b28d-dbd7744242d8\") " pod="openstack-operators/openstack-operator-index-x97qd" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.282489 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srv6t\" (UniqueName: \"kubernetes.io/projected/1741d1cd-5699-4fb2-b28d-dbd7744242d8-kube-api-access-srv6t\") pod \"openstack-operator-index-x97qd\" (UID: \"1741d1cd-5699-4fb2-b28d-dbd7744242d8\") " pod="openstack-operators/openstack-operator-index-x97qd" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.398506 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x97qd" Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.678734 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-x97qd"] Mar 20 17:30:37 crc kubenswrapper[4803]: I0320 17:30:37.753886 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x97qd" event={"ID":"1741d1cd-5699-4fb2-b28d-dbd7744242d8","Type":"ContainerStarted","Data":"19458895637fe857214ab2e0d45b082a959ad21adafe86776a173ce0e3f6d787"} Mar 20 17:30:40 crc kubenswrapper[4803]: I0320 17:30:40.428452 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-x97qd"] Mar 20 17:30:40 crc kubenswrapper[4803]: I0320 17:30:40.775082 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x97qd" event={"ID":"1741d1cd-5699-4fb2-b28d-dbd7744242d8","Type":"ContainerStarted","Data":"206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2"} Mar 20 17:30:40 crc 
kubenswrapper[4803]: I0320 17:30:40.800059 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-x97qd" podStartSLOduration=1.431021417 podStartE2EDuration="3.800035118s" podCreationTimestamp="2026-03-20 17:30:37 +0000 UTC" firstStartedPulling="2026-03-20 17:30:37.696181708 +0000 UTC m=+847.607773788" lastFinishedPulling="2026-03-20 17:30:40.065195409 +0000 UTC m=+849.976787489" observedRunningTime="2026-03-20 17:30:40.797977448 +0000 UTC m=+850.709569578" watchObservedRunningTime="2026-03-20 17:30:40.800035118 +0000 UTC m=+850.711627198" Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.050864 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-sbnxs"] Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.052597 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.061934 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sbnxs"] Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.239512 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8llv\" (UniqueName: \"kubernetes.io/projected/e3b2e374-c317-4d96-80e5-f3e274b31ea8-kube-api-access-n8llv\") pod \"openstack-operator-index-sbnxs\" (UID: \"e3b2e374-c317-4d96-80e5-f3e274b31ea8\") " pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.340774 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8llv\" (UniqueName: \"kubernetes.io/projected/e3b2e374-c317-4d96-80e5-f3e274b31ea8-kube-api-access-n8llv\") pod \"openstack-operator-index-sbnxs\" (UID: \"e3b2e374-c317-4d96-80e5-f3e274b31ea8\") " 
pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.372291 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8llv\" (UniqueName: \"kubernetes.io/projected/e3b2e374-c317-4d96-80e5-f3e274b31ea8-kube-api-access-n8llv\") pod \"openstack-operator-index-sbnxs\" (UID: \"e3b2e374-c317-4d96-80e5-f3e274b31ea8\") " pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.408788 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.676846 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-sbnxs"] Mar 20 17:30:41 crc kubenswrapper[4803]: W0320 17:30:41.681960 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3b2e374_c317_4d96_80e5_f3e274b31ea8.slice/crio-b136da8f3762cf6cda1fd56f66b534a6496780fb841bea26b1b0509fd19aa6d4 WatchSource:0}: Error finding container b136da8f3762cf6cda1fd56f66b534a6496780fb841bea26b1b0509fd19aa6d4: Status 404 returned error can't find the container with id b136da8f3762cf6cda1fd56f66b534a6496780fb841bea26b1b0509fd19aa6d4 Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.783402 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sbnxs" event={"ID":"e3b2e374-c317-4d96-80e5-f3e274b31ea8","Type":"ContainerStarted","Data":"b136da8f3762cf6cda1fd56f66b534a6496780fb841bea26b1b0509fd19aa6d4"} Mar 20 17:30:41 crc kubenswrapper[4803]: I0320 17:30:41.783588 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-x97qd" podUID="1741d1cd-5699-4fb2-b28d-dbd7744242d8" containerName="registry-server" 
containerID="cri-o://206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2" gracePeriod=2 Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.050802 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wkwlf" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.154509 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x97qd" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.353809 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srv6t\" (UniqueName: \"kubernetes.io/projected/1741d1cd-5699-4fb2-b28d-dbd7744242d8-kube-api-access-srv6t\") pod \"1741d1cd-5699-4fb2-b28d-dbd7744242d8\" (UID: \"1741d1cd-5699-4fb2-b28d-dbd7744242d8\") " Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.363187 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1741d1cd-5699-4fb2-b28d-dbd7744242d8-kube-api-access-srv6t" (OuterVolumeSpecName: "kube-api-access-srv6t") pod "1741d1cd-5699-4fb2-b28d-dbd7744242d8" (UID: "1741d1cd-5699-4fb2-b28d-dbd7744242d8"). InnerVolumeSpecName "kube-api-access-srv6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.455003 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srv6t\" (UniqueName: \"kubernetes.io/projected/1741d1cd-5699-4fb2-b28d-dbd7744242d8-kube-api-access-srv6t\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.641645 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-6nwbj" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.726241 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-24x6b" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.792444 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-sbnxs" event={"ID":"e3b2e374-c317-4d96-80e5-f3e274b31ea8","Type":"ContainerStarted","Data":"e6f02953fc66baa18c425afd535de5049940ae32e3cf672dfc019218667ca25e"} Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.794124 4803 generic.go:334] "Generic (PLEG): container finished" podID="1741d1cd-5699-4fb2-b28d-dbd7744242d8" containerID="206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2" exitCode=0 Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.794170 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x97qd" event={"ID":"1741d1cd-5699-4fb2-b28d-dbd7744242d8","Type":"ContainerDied","Data":"206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2"} Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.794207 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-x97qd" event={"ID":"1741d1cd-5699-4fb2-b28d-dbd7744242d8","Type":"ContainerDied","Data":"19458895637fe857214ab2e0d45b082a959ad21adafe86776a173ce0e3f6d787"} Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 
17:30:42.794239 4803 scope.go:117] "RemoveContainer" containerID="206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.794257 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-x97qd" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.814008 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-sbnxs" podStartSLOduration=1.756053756 podStartE2EDuration="1.813992403s" podCreationTimestamp="2026-03-20 17:30:41 +0000 UTC" firstStartedPulling="2026-03-20 17:30:41.68561837 +0000 UTC m=+851.597210450" lastFinishedPulling="2026-03-20 17:30:41.743557017 +0000 UTC m=+851.655149097" observedRunningTime="2026-03-20 17:30:42.813189849 +0000 UTC m=+852.724781919" watchObservedRunningTime="2026-03-20 17:30:42.813992403 +0000 UTC m=+852.725584473" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.828792 4803 scope.go:117] "RemoveContainer" containerID="206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2" Mar 20 17:30:42 crc kubenswrapper[4803]: E0320 17:30:42.829563 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2\": container with ID starting with 206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2 not found: ID does not exist" containerID="206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.829604 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2"} err="failed to get container status \"206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2\": rpc error: code = NotFound desc = could not find 
container \"206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2\": container with ID starting with 206434783d1f49aec5c69dc59ead7ef3fff9ca7f62b15c487668f4c7a55a0bb2 not found: ID does not exist" Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.860371 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-x97qd"] Mar 20 17:30:42 crc kubenswrapper[4803]: I0320 17:30:42.860995 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-x97qd"] Mar 20 17:30:44 crc kubenswrapper[4803]: I0320 17:30:44.863987 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1741d1cd-5699-4fb2-b28d-dbd7744242d8" path="/var/lib/kubelet/pods/1741d1cd-5699-4fb2-b28d-dbd7744242d8/volumes" Mar 20 17:30:51 crc kubenswrapper[4803]: I0320 17:30:51.409169 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:51 crc kubenswrapper[4803]: I0320 17:30:51.409605 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:51 crc kubenswrapper[4803]: I0320 17:30:51.450359 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:51 crc kubenswrapper[4803]: I0320 17:30:51.899373 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-sbnxs" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.089683 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx"] Mar 20 17:30:53 crc kubenswrapper[4803]: E0320 17:30:53.090329 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1741d1cd-5699-4fb2-b28d-dbd7744242d8" containerName="registry-server" Mar 20 
17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.090350 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1741d1cd-5699-4fb2-b28d-dbd7744242d8" containerName="registry-server" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.090607 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="1741d1cd-5699-4fb2-b28d-dbd7744242d8" containerName="registry-server" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.092006 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.094937 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-pcc6n" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.116116 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx"] Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.285891 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgrs8\" (UniqueName: \"kubernetes.io/projected/52faafec-fb6f-44ed-8ac3-22022b6fb95e-kube-api-access-jgrs8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.285975 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc 
kubenswrapper[4803]: I0320 17:30:53.286042 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.388855 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgrs8\" (UniqueName: \"kubernetes.io/projected/52faafec-fb6f-44ed-8ac3-22022b6fb95e-kube-api-access-jgrs8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.388956 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.389048 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.389987 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-util\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.390132 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-bundle\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.422099 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgrs8\" (UniqueName: \"kubernetes.io/projected/52faafec-fb6f-44ed-8ac3-22022b6fb95e-kube-api-access-jgrs8\") pod \"9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.464930 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:53 crc kubenswrapper[4803]: I0320 17:30:53.929926 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx"] Mar 20 17:30:53 crc kubenswrapper[4803]: W0320 17:30:53.939987 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52faafec_fb6f_44ed_8ac3_22022b6fb95e.slice/crio-ff329bd8374314d916a65511205344f7425a35d1030a89cfa9823584317d721d WatchSource:0}: Error finding container ff329bd8374314d916a65511205344f7425a35d1030a89cfa9823584317d721d: Status 404 returned error can't find the container with id ff329bd8374314d916a65511205344f7425a35d1030a89cfa9823584317d721d Mar 20 17:30:54 crc kubenswrapper[4803]: I0320 17:30:54.900521 4803 generic.go:334] "Generic (PLEG): container finished" podID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerID="04e3680eca88950b2208c46416c2a8164ca14c06e1f4e2542f8b49f584608bd6" exitCode=0 Mar 20 17:30:54 crc kubenswrapper[4803]: I0320 17:30:54.900632 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" event={"ID":"52faafec-fb6f-44ed-8ac3-22022b6fb95e","Type":"ContainerDied","Data":"04e3680eca88950b2208c46416c2a8164ca14c06e1f4e2542f8b49f584608bd6"} Mar 20 17:30:54 crc kubenswrapper[4803]: I0320 17:30:54.900693 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" event={"ID":"52faafec-fb6f-44ed-8ac3-22022b6fb95e","Type":"ContainerStarted","Data":"ff329bd8374314d916a65511205344f7425a35d1030a89cfa9823584317d721d"} Mar 20 17:30:55 crc kubenswrapper[4803]: I0320 17:30:55.911806 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerID="876a2b57503aa7a3ac41a8a6d9cf48b17c0d88b84e36d51d2a83e1eb8bcd2f08" exitCode=0 Mar 20 17:30:55 crc kubenswrapper[4803]: I0320 17:30:55.912093 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" event={"ID":"52faafec-fb6f-44ed-8ac3-22022b6fb95e","Type":"ContainerDied","Data":"876a2b57503aa7a3ac41a8a6d9cf48b17c0d88b84e36d51d2a83e1eb8bcd2f08"} Mar 20 17:30:56 crc kubenswrapper[4803]: I0320 17:30:56.927021 4803 generic.go:334] "Generic (PLEG): container finished" podID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerID="6d33e0b7a15db9f57f7cc6f6a5a914d9d46d87f00ca547c7b451e32479484d96" exitCode=0 Mar 20 17:30:56 crc kubenswrapper[4803]: I0320 17:30:56.927069 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" event={"ID":"52faafec-fb6f-44ed-8ac3-22022b6fb95e","Type":"ContainerDied","Data":"6d33e0b7a15db9f57f7cc6f6a5a914d9d46d87f00ca547c7b451e32479484d96"} Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.340302 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.467996 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgrs8\" (UniqueName: \"kubernetes.io/projected/52faafec-fb6f-44ed-8ac3-22022b6fb95e-kube-api-access-jgrs8\") pod \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.468060 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-util\") pod \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.468152 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-bundle\") pod \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\" (UID: \"52faafec-fb6f-44ed-8ac3-22022b6fb95e\") " Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.469794 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-bundle" (OuterVolumeSpecName: "bundle") pod "52faafec-fb6f-44ed-8ac3-22022b6fb95e" (UID: "52faafec-fb6f-44ed-8ac3-22022b6fb95e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.477567 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52faafec-fb6f-44ed-8ac3-22022b6fb95e-kube-api-access-jgrs8" (OuterVolumeSpecName: "kube-api-access-jgrs8") pod "52faafec-fb6f-44ed-8ac3-22022b6fb95e" (UID: "52faafec-fb6f-44ed-8ac3-22022b6fb95e"). InnerVolumeSpecName "kube-api-access-jgrs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.482697 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-util" (OuterVolumeSpecName: "util") pod "52faafec-fb6f-44ed-8ac3-22022b6fb95e" (UID: "52faafec-fb6f-44ed-8ac3-22022b6fb95e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.569812 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgrs8\" (UniqueName: \"kubernetes.io/projected/52faafec-fb6f-44ed-8ac3-22022b6fb95e-kube-api-access-jgrs8\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.569881 4803 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-util\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.569907 4803 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52faafec-fb6f-44ed-8ac3-22022b6fb95e-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.947977 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" event={"ID":"52faafec-fb6f-44ed-8ac3-22022b6fb95e","Type":"ContainerDied","Data":"ff329bd8374314d916a65511205344f7425a35d1030a89cfa9823584317d721d"} Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.948035 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff329bd8374314d916a65511205344f7425a35d1030a89cfa9823584317d721d" Mar 20 17:30:58 crc kubenswrapper[4803]: I0320 17:30:58.948091 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.144912 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt"] Mar 20 17:31:05 crc kubenswrapper[4803]: E0320 17:31:05.146149 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerName="extract" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.146162 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerName="extract" Mar 20 17:31:05 crc kubenswrapper[4803]: E0320 17:31:05.146181 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerName="util" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.146186 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerName="util" Mar 20 17:31:05 crc kubenswrapper[4803]: E0320 17:31:05.146200 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerName="pull" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.146206 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerName="pull" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.146301 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="52faafec-fb6f-44ed-8ac3-22022b6fb95e" containerName="extract" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.146694 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.149303 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-7tlxz" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.168333 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt"] Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.268900 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzx2z\" (UniqueName: \"kubernetes.io/projected/c7141176-2b0e-4fb1-8c92-a424c769e059-kube-api-access-xzx2z\") pod \"openstack-operator-controller-init-65b67cc5c9-9mrzt\" (UID: \"c7141176-2b0e-4fb1-8c92-a424c769e059\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.370724 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzx2z\" (UniqueName: \"kubernetes.io/projected/c7141176-2b0e-4fb1-8c92-a424c769e059-kube-api-access-xzx2z\") pod \"openstack-operator-controller-init-65b67cc5c9-9mrzt\" (UID: \"c7141176-2b0e-4fb1-8c92-a424c769e059\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.402735 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzx2z\" (UniqueName: \"kubernetes.io/projected/c7141176-2b0e-4fb1-8c92-a424c769e059-kube-api-access-xzx2z\") pod \"openstack-operator-controller-init-65b67cc5c9-9mrzt\" (UID: \"c7141176-2b0e-4fb1-8c92-a424c769e059\") " pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" Mar 20 17:31:05 crc kubenswrapper[4803]: I0320 17:31:05.478842 4803 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" Mar 20 17:31:06 crc kubenswrapper[4803]: I0320 17:31:06.001137 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt"] Mar 20 17:31:06 crc kubenswrapper[4803]: W0320 17:31:06.004187 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7141176_2b0e_4fb1_8c92_a424c769e059.slice/crio-e936c9d7f9e527c98b5da32c4579e68f924ef0f3a48892717ac138209775fee4 WatchSource:0}: Error finding container e936c9d7f9e527c98b5da32c4579e68f924ef0f3a48892717ac138209775fee4: Status 404 returned error can't find the container with id e936c9d7f9e527c98b5da32c4579e68f924ef0f3a48892717ac138209775fee4 Mar 20 17:31:06 crc kubenswrapper[4803]: I0320 17:31:06.034698 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" event={"ID":"c7141176-2b0e-4fb1-8c92-a424c769e059","Type":"ContainerStarted","Data":"e936c9d7f9e527c98b5da32c4579e68f924ef0f3a48892717ac138209775fee4"} Mar 20 17:31:10 crc kubenswrapper[4803]: I0320 17:31:10.082475 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" event={"ID":"c7141176-2b0e-4fb1-8c92-a424c769e059","Type":"ContainerStarted","Data":"a9f97a35c5ff972adefe6daf654a36872b4aabd34d6b2ca5b73b23f950842c1e"} Mar 20 17:31:10 crc kubenswrapper[4803]: I0320 17:31:10.082930 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" Mar 20 17:31:10 crc kubenswrapper[4803]: I0320 17:31:10.114100 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" podStartSLOduration=1.462298674 
podStartE2EDuration="5.11408409s" podCreationTimestamp="2026-03-20 17:31:05 +0000 UTC" firstStartedPulling="2026-03-20 17:31:06.007780795 +0000 UTC m=+875.919372905" lastFinishedPulling="2026-03-20 17:31:09.659566241 +0000 UTC m=+879.571158321" observedRunningTime="2026-03-20 17:31:10.108488146 +0000 UTC m=+880.020080286" watchObservedRunningTime="2026-03-20 17:31:10.11408409 +0000 UTC m=+880.025676160" Mar 20 17:31:15 crc kubenswrapper[4803]: I0320 17:31:15.482981 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b67cc5c9-9mrzt" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.126442 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.127864 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" Mar 20 17:31:33 crc kubenswrapper[4803]: W0320 17:31:33.129369 4803 reflector.go:561] object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-b2flt": failed to list *v1.Secret: secrets "barbican-operator-controller-manager-dockercfg-b2flt" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Mar 20 17:31:33 crc kubenswrapper[4803]: E0320 17:31:33.129450 4803 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"barbican-operator-controller-manager-dockercfg-b2flt\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"barbican-operator-controller-manager-dockercfg-b2flt\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.147094 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.148497 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.149975 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zhdwj" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.162128 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.163102 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.167551 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.171948 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-w6rjt" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.178644 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.179998 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.196004 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-mg7r2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.202534 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.235686 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.266653 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svmcx\" (UniqueName: \"kubernetes.io/projected/fefd7fa6-fd31-4b28-a7e9-1a4e630070fe-kube-api-access-svmcx\") pod \"barbican-operator-controller-manager-59bc569d95-8gf25\" (UID: \"fefd7fa6-fd31-4b28-a7e9-1a4e630070fe\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.266725 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfzf\" (UniqueName: \"kubernetes.io/projected/b3eec30a-a8b9-40b8-a786-16a339efe990-kube-api-access-xqfzf\") pod \"glance-operator-controller-manager-79df6bcc97-lxz9k\" (UID: \"b3eec30a-a8b9-40b8-a786-16a339efe990\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.266755 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq2lr\" (UniqueName: \"kubernetes.io/projected/524f49fa-3d73-4089-aa72-cbcfdfbed979-kube-api-access-tq2lr\") pod 
\"cinder-operator-controller-manager-8d58dc466-7w5w2\" (UID: \"524f49fa-3d73-4089-aa72-cbcfdfbed979\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.266786 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8jw4\" (UniqueName: \"kubernetes.io/projected/cfe324bc-b8c8-4971-b1d5-ed9df499771f-kube-api-access-t8jw4\") pod \"designate-operator-controller-manager-588d4d986b-b7sz6\" (UID: \"cfe324bc-b8c8-4971-b1d5-ed9df499771f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.269285 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.283671 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.284501 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.287795 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-w7ptw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.292535 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.293561 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.298022 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.299626 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mqrjx" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.306780 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.307749 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.309834 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-mw6p5" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.309967 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.313592 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.319496 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.323981 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.324685 4803 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.330592 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8qkrh" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.330783 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.335002 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.335808 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.340828 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-rvktc" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.349826 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.353533 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.354301 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.356725 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vt5h5" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.365836 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.368817 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svmcx\" (UniqueName: \"kubernetes.io/projected/fefd7fa6-fd31-4b28-a7e9-1a4e630070fe-kube-api-access-svmcx\") pod \"barbican-operator-controller-manager-59bc569d95-8gf25\" (UID: \"fefd7fa6-fd31-4b28-a7e9-1a4e630070fe\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.368881 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfzf\" (UniqueName: \"kubernetes.io/projected/b3eec30a-a8b9-40b8-a786-16a339efe990-kube-api-access-xqfzf\") pod \"glance-operator-controller-manager-79df6bcc97-lxz9k\" (UID: \"b3eec30a-a8b9-40b8-a786-16a339efe990\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.368901 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq2lr\" (UniqueName: \"kubernetes.io/projected/524f49fa-3d73-4089-aa72-cbcfdfbed979-kube-api-access-tq2lr\") pod \"cinder-operator-controller-manager-8d58dc466-7w5w2\" (UID: \"524f49fa-3d73-4089-aa72-cbcfdfbed979\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.368923 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t8jw4\" (UniqueName: \"kubernetes.io/projected/cfe324bc-b8c8-4971-b1d5-ed9df499771f-kube-api-access-t8jw4\") pod \"designate-operator-controller-manager-588d4d986b-b7sz6\" (UID: \"cfe324bc-b8c8-4971-b1d5-ed9df499771f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.368968 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g8xb\" (UniqueName: \"kubernetes.io/projected/25fa5c58-b9f7-4cd8-b1c4-41c190df40f1-kube-api-access-2g8xb\") pod \"horizon-operator-controller-manager-8464cc45fb-smx6n\" (UID: \"25fa5c58-b9f7-4cd8-b1c4-41c190df40f1\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.369001 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkshq\" (UniqueName: \"kubernetes.io/projected/10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5-kube-api-access-gkshq\") pod \"heat-operator-controller-manager-67dd5f86f5-b5q6h\" (UID: \"10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.375702 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.376470 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.384079 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6qnpw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.386426 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.387803 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.393731 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-mvw8r" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.415686 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqfzf\" (UniqueName: \"kubernetes.io/projected/b3eec30a-a8b9-40b8-a786-16a339efe990-kube-api-access-xqfzf\") pod \"glance-operator-controller-manager-79df6bcc97-lxz9k\" (UID: \"b3eec30a-a8b9-40b8-a786-16a339efe990\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.418566 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.419899 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8jw4\" (UniqueName: \"kubernetes.io/projected/cfe324bc-b8c8-4971-b1d5-ed9df499771f-kube-api-access-t8jw4\") pod \"designate-operator-controller-manager-588d4d986b-b7sz6\" (UID: \"cfe324bc-b8c8-4971-b1d5-ed9df499771f\") " 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.420077 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svmcx\" (UniqueName: \"kubernetes.io/projected/fefd7fa6-fd31-4b28-a7e9-1a4e630070fe-kube-api-access-svmcx\") pod \"barbican-operator-controller-manager-59bc569d95-8gf25\" (UID: \"fefd7fa6-fd31-4b28-a7e9-1a4e630070fe\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.420349 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq2lr\" (UniqueName: \"kubernetes.io/projected/524f49fa-3d73-4089-aa72-cbcfdfbed979-kube-api-access-tq2lr\") pod \"cinder-operator-controller-manager-8d58dc466-7w5w2\" (UID: \"524f49fa-3d73-4089-aa72-cbcfdfbed979\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.442584 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.465894 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.473846 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrwbk\" (UniqueName: \"kubernetes.io/projected/23dabc24-9851-4232-8f79-56c8615246c7-kube-api-access-vrwbk\") pod \"manila-operator-controller-manager-55f864c847-wgxn7\" (UID: \"23dabc24-9851-4232-8f79-56c8615246c7\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.473877 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xzhg\" (UniqueName: \"kubernetes.io/projected/7a5c980d-5ccf-4e9d-9687-3119240ecc15-kube-api-access-6xzhg\") pod \"ironic-operator-controller-manager-6f787dddc9-cqwk9\" (UID: \"7a5c980d-5ccf-4e9d-9687-3119240ecc15\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.473918 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g8xb\" (UniqueName: \"kubernetes.io/projected/25fa5c58-b9f7-4cd8-b1c4-41c190df40f1-kube-api-access-2g8xb\") pod \"horizon-operator-controller-manager-8464cc45fb-smx6n\" (UID: \"25fa5c58-b9f7-4cd8-b1c4-41c190df40f1\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.473950 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:33 crc kubenswrapper[4803]: 
I0320 17:31:33.473986 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnk5c\" (UniqueName: \"kubernetes.io/projected/b01812c6-e35c-4699-8b7b-192a425bf0ce-kube-api-access-cnk5c\") pod \"mariadb-operator-controller-manager-67ccfc9778-6pfw8\" (UID: \"b01812c6-e35c-4699-8b7b-192a425bf0ce\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.474015 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkshq\" (UniqueName: \"kubernetes.io/projected/10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5-kube-api-access-gkshq\") pod \"heat-operator-controller-manager-67dd5f86f5-b5q6h\" (UID: \"10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.474065 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94sxz\" (UniqueName: \"kubernetes.io/projected/5ea17b5b-3363-4c08-a7e3-52cbb4cb5616-kube-api-access-94sxz\") pod \"keystone-operator-controller-manager-768b96df4c-k8594\" (UID: \"5ea17b5b-3363-4c08-a7e3-52cbb4cb5616\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.474088 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks7m8\" (UniqueName: \"kubernetes.io/projected/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-kube-api-access-ks7m8\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.474104 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rpffq\" (UniqueName: \"kubernetes.io/projected/75cb9425-bd1a-4311-a85f-76eee943e0e9-kube-api-access-rpffq\") pod \"neutron-operator-controller-manager-767865f676-xwbd2\" (UID: \"75cb9425-bd1a-4311-a85f-76eee943e0e9\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.476177 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.477898 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.488395 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ph79z" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.488901 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.492877 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.495211 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.496353 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.497569 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g8xb\" (UniqueName: \"kubernetes.io/projected/25fa5c58-b9f7-4cd8-b1c4-41c190df40f1-kube-api-access-2g8xb\") pod \"horizon-operator-controller-manager-8464cc45fb-smx6n\" (UID: \"25fa5c58-b9f7-4cd8-b1c4-41c190df40f1\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.499685 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cm54m" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.500404 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkshq\" (UniqueName: \"kubernetes.io/projected/10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5-kube-api-access-gkshq\") pod \"heat-operator-controller-manager-67dd5f86f5-b5q6h\" (UID: \"10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.512022 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.512062 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.539284 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.541054 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.543993 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-2w2rf" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.545468 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.575464 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.576865 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrwbk\" (UniqueName: \"kubernetes.io/projected/23dabc24-9851-4232-8f79-56c8615246c7-kube-api-access-vrwbk\") pod \"manila-operator-controller-manager-55f864c847-wgxn7\" (UID: \"23dabc24-9851-4232-8f79-56c8615246c7\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.576917 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xzhg\" (UniqueName: \"kubernetes.io/projected/7a5c980d-5ccf-4e9d-9687-3119240ecc15-kube-api-access-6xzhg\") pod \"ironic-operator-controller-manager-6f787dddc9-cqwk9\" (UID: \"7a5c980d-5ccf-4e9d-9687-3119240ecc15\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.576972 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: 
\"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.577003 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnk5c\" (UniqueName: \"kubernetes.io/projected/b01812c6-e35c-4699-8b7b-192a425bf0ce-kube-api-access-cnk5c\") pod \"mariadb-operator-controller-manager-67ccfc9778-6pfw8\" (UID: \"b01812c6-e35c-4699-8b7b-192a425bf0ce\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.577062 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtpc\" (UniqueName: \"kubernetes.io/projected/e5270e18-ac4b-4f4d-8a6d-085699034cfe-kube-api-access-bvtpc\") pod \"octavia-operator-controller-manager-5b9f45d989-vmmmv\" (UID: \"e5270e18-ac4b-4f4d-8a6d-085699034cfe\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.577112 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcfzx\" (UniqueName: \"kubernetes.io/projected/ff62aaff-bfdc-400e-b6ee-217356ba1a23-kube-api-access-gcfzx\") pod \"nova-operator-controller-manager-5d488d59fb-9dwg8\" (UID: \"ff62aaff-bfdc-400e-b6ee-217356ba1a23\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.577167 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94sxz\" (UniqueName: \"kubernetes.io/projected/5ea17b5b-3363-4c08-a7e3-52cbb4cb5616-kube-api-access-94sxz\") pod \"keystone-operator-controller-manager-768b96df4c-k8594\" (UID: \"5ea17b5b-3363-4c08-a7e3-52cbb4cb5616\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" 
Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.577213 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ks7m8\" (UniqueName: \"kubernetes.io/projected/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-kube-api-access-ks7m8\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:33 crc kubenswrapper[4803]: E0320 17:31:33.577633 4803 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:33 crc kubenswrapper[4803]: E0320 17:31:33.577704 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert podName:0dbcced3-01dd-45bb-8f6f-1733abb4f7db nodeName:}" failed. No retries permitted until 2026-03-20 17:31:34.077682407 +0000 UTC m=+903.989274477 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-qxc8h" (UID: "0dbcced3-01dd-45bb-8f6f-1733abb4f7db") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.578276 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpffq\" (UniqueName: \"kubernetes.io/projected/75cb9425-bd1a-4311-a85f-76eee943e0e9-kube-api-access-rpffq\") pod \"neutron-operator-controller-manager-767865f676-xwbd2\" (UID: \"75cb9425-bd1a-4311-a85f-76eee943e0e9\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.624906 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.626341 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-lktjh"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.628855 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.629942 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.634062 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-znpvp" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.635114 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94sxz\" (UniqueName: \"kubernetes.io/projected/5ea17b5b-3363-4c08-a7e3-52cbb4cb5616-kube-api-access-94sxz\") pod \"keystone-operator-controller-manager-768b96df4c-k8594\" (UID: \"5ea17b5b-3363-4c08-a7e3-52cbb4cb5616\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.637514 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnk5c\" (UniqueName: \"kubernetes.io/projected/b01812c6-e35c-4699-8b7b-192a425bf0ce-kube-api-access-cnk5c\") pod \"mariadb-operator-controller-manager-67ccfc9778-6pfw8\" (UID: \"b01812c6-e35c-4699-8b7b-192a425bf0ce\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.641003 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xzhg\" 
(UniqueName: \"kubernetes.io/projected/7a5c980d-5ccf-4e9d-9687-3119240ecc15-kube-api-access-6xzhg\") pod \"ironic-operator-controller-manager-6f787dddc9-cqwk9\" (UID: \"7a5c980d-5ccf-4e9d-9687-3119240ecc15\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.665107 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.666808 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ks7m8\" (UniqueName: \"kubernetes.io/projected/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-kube-api-access-ks7m8\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.669264 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.679177 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8zhh\" (UniqueName: \"kubernetes.io/projected/002615ba-1b17-467b-a536-7a631c5b434e-kube-api-access-t8zhh\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.679224 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.679319 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtpc\" (UniqueName: \"kubernetes.io/projected/e5270e18-ac4b-4f4d-8a6d-085699034cfe-kube-api-access-bvtpc\") pod \"octavia-operator-controller-manager-5b9f45d989-vmmmv\" (UID: \"e5270e18-ac4b-4f4d-8a6d-085699034cfe\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.679345 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcfzx\" (UniqueName: \"kubernetes.io/projected/ff62aaff-bfdc-400e-b6ee-217356ba1a23-kube-api-access-gcfzx\") pod \"nova-operator-controller-manager-5d488d59fb-9dwg8\" (UID: \"ff62aaff-bfdc-400e-b6ee-217356ba1a23\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" Mar 20 
17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.679368 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcp9q\" (UniqueName: \"kubernetes.io/projected/b6b15ce6-e69e-42ad-a356-9802f8750db4-kube-api-access-dcp9q\") pod \"ovn-operator-controller-manager-884679f54-lktjh\" (UID: \"b6b15ce6-e69e-42ad-a356-9802f8750db4\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.679462 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrwbk\" (UniqueName: \"kubernetes.io/projected/23dabc24-9851-4232-8f79-56c8615246c7-kube-api-access-vrwbk\") pod \"manila-operator-controller-manager-55f864c847-wgxn7\" (UID: \"23dabc24-9851-4232-8f79-56c8615246c7\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.682084 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-22skw"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.684410 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpffq\" (UniqueName: \"kubernetes.io/projected/75cb9425-bd1a-4311-a85f-76eee943e0e9-kube-api-access-rpffq\") pod \"neutron-operator-controller-manager-767865f676-xwbd2\" (UID: \"75cb9425-bd1a-4311-a85f-76eee943e0e9\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.685416 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.685666 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.687805 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hbmdb" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.689619 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.690147 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.692320 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jdsdp" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.713932 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtpc\" (UniqueName: \"kubernetes.io/projected/e5270e18-ac4b-4f4d-8a6d-085699034cfe-kube-api-access-bvtpc\") pod \"octavia-operator-controller-manager-5b9f45d989-vmmmv\" (UID: \"e5270e18-ac4b-4f4d-8a6d-085699034cfe\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.724488 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-lktjh"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.730181 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcfzx\" (UniqueName: \"kubernetes.io/projected/ff62aaff-bfdc-400e-b6ee-217356ba1a23-kube-api-access-gcfzx\") pod \"nova-operator-controller-manager-5d488d59fb-9dwg8\" (UID: \"ff62aaff-bfdc-400e-b6ee-217356ba1a23\") " 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.755865 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-22skw"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.766233 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.775045 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.780423 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8zhh\" (UniqueName: \"kubernetes.io/projected/002615ba-1b17-467b-a536-7a631c5b434e-kube-api-access-t8zhh\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.780472 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.780510 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbk64\" (UniqueName: \"kubernetes.io/projected/6616ca24-82bf-405a-b77b-8617c65ec76b-kube-api-access-bbk64\") pod \"swift-operator-controller-manager-c674c5965-sh6jl\" (UID: 
\"6616ca24-82bf-405a-b77b-8617c65ec76b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.780548 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4glw\" (UniqueName: \"kubernetes.io/projected/f7f518fa-6e0e-431d-88f4-f835400eec2a-kube-api-access-l4glw\") pod \"placement-operator-controller-manager-5784578c99-22skw\" (UID: \"f7f518fa-6e0e-431d-88f4-f835400eec2a\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.780759 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcp9q\" (UniqueName: \"kubernetes.io/projected/b6b15ce6-e69e-42ad-a356-9802f8750db4-kube-api-access-dcp9q\") pod \"ovn-operator-controller-manager-884679f54-lktjh\" (UID: \"b6b15ce6-e69e-42ad-a356-9802f8750db4\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" Mar 20 17:31:33 crc kubenswrapper[4803]: E0320 17:31:33.780856 4803 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:33 crc kubenswrapper[4803]: E0320 17:31:33.782097 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert podName:002615ba-1b17-467b-a536-7a631c5b434e nodeName:}" failed. No retries permitted until 2026-03-20 17:31:34.282079863 +0000 UTC m=+904.193671933 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5vccbg" (UID: "002615ba-1b17-467b-a536-7a631c5b434e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.794145 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.795329 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.798298 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-92zvj" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.802556 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcp9q\" (UniqueName: \"kubernetes.io/projected/b6b15ce6-e69e-42ad-a356-9802f8750db4-kube-api-access-dcp9q\") pod \"ovn-operator-controller-manager-884679f54-lktjh\" (UID: \"b6b15ce6-e69e-42ad-a356-9802f8750db4\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.807397 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.811156 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8zhh\" (UniqueName: \"kubernetes.io/projected/002615ba-1b17-467b-a536-7a631c5b434e-kube-api-access-t8zhh\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.816865 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.819802 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.824786 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.830618 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mwdx7" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.830650 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.853393 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.859211 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.864419 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.869943 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.870090 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.876446 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4ssc5" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.882832 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwcw2\" (UniqueName: \"kubernetes.io/projected/6409d065-33f9-4e12-806a-b805e4b6e0ea-kube-api-access-rwcw2\") pod \"test-operator-controller-manager-5c5cb9c4d7-b7znw\" (UID: \"6409d065-33f9-4e12-806a-b805e4b6e0ea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.882915 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98ktd\" (UniqueName: \"kubernetes.io/projected/76713a0c-ed94-4e45-a947-e05e3ef0d3d6-kube-api-access-98ktd\") pod \"telemetry-operator-controller-manager-d6b694c5-fgckh\" (UID: \"76713a0c-ed94-4e45-a947-e05e3ef0d3d6\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.883053 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbk64\" (UniqueName: \"kubernetes.io/projected/6616ca24-82bf-405a-b77b-8617c65ec76b-kube-api-access-bbk64\") pod 
\"swift-operator-controller-manager-c674c5965-sh6jl\" (UID: \"6616ca24-82bf-405a-b77b-8617c65ec76b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.883077 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4glw\" (UniqueName: \"kubernetes.io/projected/f7f518fa-6e0e-431d-88f4-f835400eec2a-kube-api-access-l4glw\") pod \"placement-operator-controller-manager-5784578c99-22skw\" (UID: \"f7f518fa-6e0e-431d-88f4-f835400eec2a\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.909147 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4glw\" (UniqueName: \"kubernetes.io/projected/f7f518fa-6e0e-431d-88f4-f835400eec2a-kube-api-access-l4glw\") pod \"placement-operator-controller-manager-5784578c99-22skw\" (UID: \"f7f518fa-6e0e-431d-88f4-f835400eec2a\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.920281 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbk64\" (UniqueName: \"kubernetes.io/projected/6616ca24-82bf-405a-b77b-8617c65ec76b-kube-api-access-bbk64\") pod \"swift-operator-controller-manager-c674c5965-sh6jl\" (UID: \"6616ca24-82bf-405a-b77b-8617c65ec76b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.921565 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.922755 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.925641 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.927648 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.930162 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-ztdtz" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.959267 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.972932 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.985027 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2"] Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.985835 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.985902 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vklmk\" (UniqueName: 
\"kubernetes.io/projected/a8a0f2f5-7910-44c7-969d-204a7d1327d9-kube-api-access-vklmk\") pod \"watcher-operator-controller-manager-6c4d75f7f9-dxp5d\" (UID: \"a8a0f2f5-7910-44c7-969d-204a7d1327d9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.985930 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.985951 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwcw2\" (UniqueName: \"kubernetes.io/projected/6409d065-33f9-4e12-806a-b805e4b6e0ea-kube-api-access-rwcw2\") pod \"test-operator-controller-manager-5c5cb9c4d7-b7znw\" (UID: \"6409d065-33f9-4e12-806a-b805e4b6e0ea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.985973 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfpg2\" (UniqueName: \"kubernetes.io/projected/36ca4d28-1feb-4c48-bba8-d078f85fc37f-kube-api-access-gfpg2\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:33 crc kubenswrapper[4803]: I0320 17:31:33.986009 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98ktd\" (UniqueName: \"kubernetes.io/projected/76713a0c-ed94-4e45-a947-e05e3ef0d3d6-kube-api-access-98ktd\") pod 
\"telemetry-operator-controller-manager-d6b694c5-fgckh\" (UID: \"76713a0c-ed94-4e45-a947-e05e3ef0d3d6\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.013726 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98ktd\" (UniqueName: \"kubernetes.io/projected/76713a0c-ed94-4e45-a947-e05e3ef0d3d6-kube-api-access-98ktd\") pod \"telemetry-operator-controller-manager-d6b694c5-fgckh\" (UID: \"76713a0c-ed94-4e45-a947-e05e3ef0d3d6\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.016321 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwcw2\" (UniqueName: \"kubernetes.io/projected/6409d065-33f9-4e12-806a-b805e4b6e0ea-kube-api-access-rwcw2\") pod \"test-operator-controller-manager-5c5cb9c4d7-b7znw\" (UID: \"6409d065-33f9-4e12-806a-b805e4b6e0ea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.035287 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.036937 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" Mar 20 17:31:34 crc kubenswrapper[4803]: W0320 17:31:34.040447 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524f49fa_3d73_4089_aa72_cbcfdfbed979.slice/crio-e948f06ea00a2a6535c5f12da03a74b0082469ef731708a88999b6115d41191b WatchSource:0}: Error finding container e948f06ea00a2a6535c5f12da03a74b0082469ef731708a88999b6115d41191b: Status 404 returned error can't find the container with id e948f06ea00a2a6535c5f12da03a74b0082469ef731708a88999b6115d41191b Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.087315 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.087405 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.087447 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vklmk\" (UniqueName: \"kubernetes.io/projected/a8a0f2f5-7910-44c7-969d-204a7d1327d9-kube-api-access-vklmk\") pod \"watcher-operator-controller-manager-6c4d75f7f9-dxp5d\" (UID: \"a8a0f2f5-7910-44c7-969d-204a7d1327d9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" Mar 20 17:31:34 crc 
kubenswrapper[4803]: I0320 17:31:34.087481 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.087512 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfpg2\" (UniqueName: \"kubernetes.io/projected/36ca4d28-1feb-4c48-bba8-d078f85fc37f-kube-api-access-gfpg2\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.088023 4803 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.088071 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert podName:0dbcced3-01dd-45bb-8f6f-1733abb4f7db nodeName:}" failed. No retries permitted until 2026-03-20 17:31:35.088055963 +0000 UTC m=+904.999648033 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-qxc8h" (UID: "0dbcced3-01dd-45bb-8f6f-1733abb4f7db") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.088333 4803 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.088356 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:34.588349251 +0000 UTC m=+904.499941321 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.088501 4803 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.088590 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:34.588579368 +0000 UTC m=+904.500171438 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "metrics-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.106421 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vklmk\" (UniqueName: \"kubernetes.io/projected/a8a0f2f5-7910-44c7-969d-204a7d1327d9-kube-api-access-vklmk\") pod \"watcher-operator-controller-manager-6c4d75f7f9-dxp5d\" (UID: \"a8a0f2f5-7910-44c7-969d-204a7d1327d9\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.108148 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfpg2\" (UniqueName: \"kubernetes.io/projected/36ca4d28-1feb-4c48-bba8-d078f85fc37f-kube-api-access-gfpg2\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.125516 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.163068 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.169274 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6"] Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.191893 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k"] Mar 20 17:31:34 crc kubenswrapper[4803]: W0320 17:31:34.215078 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3eec30a_a8b9_40b8_a786_16a339efe990.slice/crio-fe605579e27c9d3148673c9876d6ca4b3004b20e739ca55a465518f80b20bd19 WatchSource:0}: Error finding container fe605579e27c9d3148673c9876d6ca4b3004b20e739ca55a465518f80b20bd19: Status 404 returned error can't find the container with id fe605579e27c9d3148673c9876d6ca4b3004b20e739ca55a465518f80b20bd19 Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.224910 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.294219 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.294373 4803 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.294418 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert podName:002615ba-1b17-467b-a536-7a631c5b434e nodeName:}" failed. No retries permitted until 2026-03-20 17:31:35.294405715 +0000 UTC m=+905.205997785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5vccbg" (UID: "002615ba-1b17-467b-a536-7a631c5b434e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.358482 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" event={"ID":"cfe324bc-b8c8-4971-b1d5-ed9df499771f","Type":"ContainerStarted","Data":"47c1ce8dc28ebe815d7df12aa647ddfd320a423d1a9a52a53ae2bed44680d484"} Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.360623 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" event={"ID":"524f49fa-3d73-4089-aa72-cbcfdfbed979","Type":"ContainerStarted","Data":"e948f06ea00a2a6535c5f12da03a74b0082469ef731708a88999b6115d41191b"} Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.361762 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" event={"ID":"b3eec30a-a8b9-40b8-a786-16a339efe990","Type":"ContainerStarted","Data":"fe605579e27c9d3148673c9876d6ca4b3004b20e739ca55a465518f80b20bd19"} Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.425896 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h"] Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.455222 4803 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.455289 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.511283 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-b2flt" Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.512418 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7"] Mar 20 17:31:34 crc kubenswrapper[4803]: W0320 17:31:34.516031 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10bf5d23_ea4e_4c19_bad8_f2cb15d93cc5.slice/crio-522187c72689795f98bd89404b0a14565e3fc90889e99539c00c9bcf467a9d90 WatchSource:0}: Error finding container 522187c72689795f98bd89404b0a14565e3fc90889e99539c00c9bcf467a9d90: Status 404 returned error can't find the container with id 522187c72689795f98bd89404b0a14565e3fc90889e99539c00c9bcf467a9d90 Mar 20 17:31:34 crc kubenswrapper[4803]: W0320 17:31:34.520110 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23dabc24_9851_4232_8f79_56c8615246c7.slice/crio-f6281b84a1185b7705058a98369841bb23db36afc925f152dfb9b7e624bb5c8f WatchSource:0}: Error finding container f6281b84a1185b7705058a98369841bb23db36afc925f152dfb9b7e624bb5c8f: Status 404 returned error can't find the container with id f6281b84a1185b7705058a98369841bb23db36afc925f152dfb9b7e624bb5c8f Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.535894 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n"] Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.548764 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594"] Mar 20 17:31:34 
crc kubenswrapper[4803]: I0320 17:31:34.599563 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.600047 4803 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.600160 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.600281 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:35.600260452 +0000 UTC m=+905.511852522 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "webhook-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.600305 4803 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.600421 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:35.600413046 +0000 UTC m=+905.512005116 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "metrics-server-cert" not found Mar 20 17:31:34 crc kubenswrapper[4803]: W0320 17:31:34.755290 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a5c980d_5ccf_4e9d_9687_3119240ecc15.slice/crio-dcf8038f51bb57b5b2f05a93af970232d3b4feb6766813860d97cafd78681012 WatchSource:0}: Error finding container dcf8038f51bb57b5b2f05a93af970232d3b4feb6766813860d97cafd78681012: Status 404 returned error can't find the container with id dcf8038f51bb57b5b2f05a93af970232d3b4feb6766813860d97cafd78681012 Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.756899 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9"] Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.761739 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8"] Mar 20 17:31:34 crc kubenswrapper[4803]: W0320 17:31:34.764694 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb01812c6_e35c_4699_8b7b_192a425bf0ce.slice/crio-216f4793990be2f972a7ef25a6c1abaf48738d22268f91d9de5584a319197987 WatchSource:0}: Error finding container 216f4793990be2f972a7ef25a6c1abaf48738d22268f91d9de5584a319197987: Status 404 returned error can't find the container with id 216f4793990be2f972a7ef25a6c1abaf48738d22268f91d9de5584a319197987 Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.864185 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8"] Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.864216 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl"] Mar 20 17:31:34 crc kubenswrapper[4803]: W0320 17:31:34.871510 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff62aaff_bfdc_400e_b6ee_217356ba1a23.slice/crio-c6be1dedbd9f5f387a53226126f775c5ce6b71c19e03c7ba55e9d1b5608082ce WatchSource:0}: Error finding container c6be1dedbd9f5f387a53226126f775c5ce6b71c19e03c7ba55e9d1b5608082ce: Status 404 returned error can't find the container with id c6be1dedbd9f5f387a53226126f775c5ce6b71c19e03c7ba55e9d1b5608082ce Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.872795 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2"] Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 17:31:34.968642 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-lktjh"] Mar 20 17:31:34 crc kubenswrapper[4803]: I0320 
17:31:34.983831 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d"] Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.998720 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4glw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-22skw_openstack-operators(f7f518fa-6e0e-431d-88f4-f835400eec2a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:31:34 crc kubenswrapper[4803]: E0320 17:31:34.999931 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" podUID="f7f518fa-6e0e-431d-88f4-f835400eec2a" Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.000447 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-22skw"] Mar 20 17:31:35 crc kubenswrapper[4803]: W0320 17:31:35.001707 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5270e18_ac4b_4f4d_8a6d_085699034cfe.slice/crio-ed4ff28fb7581825b856dce5539e252f6164ca97bcbc94858eadf7000908b4d9 WatchSource:0}: Error finding container ed4ff28fb7581825b856dce5539e252f6164ca97bcbc94858eadf7000908b4d9: Status 404 returned error can't find the container with id 
ed4ff28fb7581825b856dce5539e252f6164ca97bcbc94858eadf7000908b4d9 Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.003920 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bvtpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-vmmmv_openstack-operators(e5270e18-ac4b-4f4d-8a6d-085699034cfe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.005976 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" podUID="e5270e18-ac4b-4f4d-8a6d-085699034cfe" Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.006437 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv"] Mar 20 17:31:35 crc kubenswrapper[4803]: W0320 17:31:35.007988 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6409d065_33f9_4e12_806a_b805e4b6e0ea.slice/crio-68e24ae1966797b2632bc068cabd5d6bcede054252176269d6104079d871bd16 WatchSource:0}: Error finding container 68e24ae1966797b2632bc068cabd5d6bcede054252176269d6104079d871bd16: Status 404 returned error can't find the container with id 
68e24ae1966797b2632bc068cabd5d6bcede054252176269d6104079d871bd16 Mar 20 17:31:35 crc kubenswrapper[4803]: W0320 17:31:35.009308 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76713a0c_ed94_4e45_a947_e05e3ef0d3d6.slice/crio-d1c02d1a2c227297d31320896acaec0c27b69811b413ccff2b8c3aa844e34dfa WatchSource:0}: Error finding container d1c02d1a2c227297d31320896acaec0c27b69811b413ccff2b8c3aa844e34dfa: Status 404 returned error can't find the container with id d1c02d1a2c227297d31320896acaec0c27b69811b413ccff2b8c3aa844e34dfa Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.010996 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rwcw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-b7znw_openstack-operators(6409d065-33f9-4e12-806a-b805e4b6e0ea): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.012067 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-98ktd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-fgckh_openstack-operators(76713a0c-ed94-4e45-a947-e05e3ef0d3d6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.012143 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" podUID="6409d065-33f9-4e12-806a-b805e4b6e0ea" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.013203 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" podUID="76713a0c-ed94-4e45-a947-e05e3ef0d3d6" Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.015541 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw"] Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.020600 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh"] Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.113083 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.113224 4803 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.113269 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert podName:0dbcced3-01dd-45bb-8f6f-1733abb4f7db nodeName:}" failed. No retries permitted until 2026-03-20 17:31:37.113255724 +0000 UTC m=+907.024847784 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-qxc8h" (UID: "0dbcced3-01dd-45bb-8f6f-1733abb4f7db") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.193926 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25"] Mar 20 17:31:35 crc kubenswrapper[4803]: W0320 17:31:35.206435 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefd7fa6_fd31_4b28_a7e9_1a4e630070fe.slice/crio-3128ba29c7c00c25bfa4f6947710063b95360c3f52c7493acc8109c93c7f7e5b WatchSource:0}: Error finding container 3128ba29c7c00c25bfa4f6947710063b95360c3f52c7493acc8109c93c7f7e5b: Status 404 returned error can't find the container with id 3128ba29c7c00c25bfa4f6947710063b95360c3f52c7493acc8109c93c7f7e5b Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.315053 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.315296 4803 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.315396 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert podName:002615ba-1b17-467b-a536-7a631c5b434e nodeName:}" failed. 
No retries permitted until 2026-03-20 17:31:37.315375152 +0000 UTC m=+907.226967222 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5vccbg" (UID: "002615ba-1b17-467b-a536-7a631c5b434e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.371213 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" event={"ID":"a8a0f2f5-7910-44c7-969d-204a7d1327d9","Type":"ContainerStarted","Data":"b42df6eca40318af5dbba872a9dec8ed632366a6ceac4c011926b5f409e8788f"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.373175 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" event={"ID":"25fa5c58-b9f7-4cd8-b1c4-41c190df40f1","Type":"ContainerStarted","Data":"61396516e4d999f4a811ba3412616abe9ff724bd715c1e934ed8896f6ed56862"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.374895 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" event={"ID":"6616ca24-82bf-405a-b77b-8617c65ec76b","Type":"ContainerStarted","Data":"c8d6418f1565160b8989f9083430abe4cc6019e5e799b617427036f0e8e6e3bd"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.376408 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" event={"ID":"b01812c6-e35c-4699-8b7b-192a425bf0ce","Type":"ContainerStarted","Data":"216f4793990be2f972a7ef25a6c1abaf48738d22268f91d9de5584a319197987"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.378961 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" event={"ID":"e5270e18-ac4b-4f4d-8a6d-085699034cfe","Type":"ContainerStarted","Data":"ed4ff28fb7581825b856dce5539e252f6164ca97bcbc94858eadf7000908b4d9"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.381942 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" event={"ID":"f7f518fa-6e0e-431d-88f4-f835400eec2a","Type":"ContainerStarted","Data":"8cbdcd560f4e7e78ae37fc2ae47b68a5e47de9fca9404bb839605758b3a25897"} Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.382189 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" podUID="e5270e18-ac4b-4f4d-8a6d-085699034cfe" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.383370 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" podUID="f7f518fa-6e0e-431d-88f4-f835400eec2a" Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.384008 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" event={"ID":"6409d065-33f9-4e12-806a-b805e4b6e0ea","Type":"ContainerStarted","Data":"68e24ae1966797b2632bc068cabd5d6bcede054252176269d6104079d871bd16"} Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.385281 4803 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" podUID="6409d065-33f9-4e12-806a-b805e4b6e0ea" Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.385372 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" event={"ID":"76713a0c-ed94-4e45-a947-e05e3ef0d3d6","Type":"ContainerStarted","Data":"d1c02d1a2c227297d31320896acaec0c27b69811b413ccff2b8c3aa844e34dfa"} Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.387147 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" podUID="76713a0c-ed94-4e45-a947-e05e3ef0d3d6" Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.388848 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" event={"ID":"5ea17b5b-3363-4c08-a7e3-52cbb4cb5616","Type":"ContainerStarted","Data":"a01dc1c8253d180fba732651c7fb62fd140cef6603bd76f0479d430b6d041660"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.390411 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" event={"ID":"ff62aaff-bfdc-400e-b6ee-217356ba1a23","Type":"ContainerStarted","Data":"c6be1dedbd9f5f387a53226126f775c5ce6b71c19e03c7ba55e9d1b5608082ce"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.392087 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" event={"ID":"23dabc24-9851-4232-8f79-56c8615246c7","Type":"ContainerStarted","Data":"f6281b84a1185b7705058a98369841bb23db36afc925f152dfb9b7e624bb5c8f"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.404552 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" event={"ID":"10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5","Type":"ContainerStarted","Data":"522187c72689795f98bd89404b0a14565e3fc90889e99539c00c9bcf467a9d90"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.406340 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" event={"ID":"75cb9425-bd1a-4311-a85f-76eee943e0e9","Type":"ContainerStarted","Data":"50f2509835b5f9cd8d4c92c4ed201d292d2be0cefa881804d6fdf84f5057c480"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.407591 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" event={"ID":"fefd7fa6-fd31-4b28-a7e9-1a4e630070fe","Type":"ContainerStarted","Data":"3128ba29c7c00c25bfa4f6947710063b95360c3f52c7493acc8109c93c7f7e5b"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.410510 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" event={"ID":"b6b15ce6-e69e-42ad-a356-9802f8750db4","Type":"ContainerStarted","Data":"ee0a63a7851c0743314bf7e46e08d16150372047ba82324b1801f99f35ecb409"} Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.411858 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" event={"ID":"7a5c980d-5ccf-4e9d-9687-3119240ecc15","Type":"ContainerStarted","Data":"dcf8038f51bb57b5b2f05a93af970232d3b4feb6766813860d97cafd78681012"} Mar 20 17:31:35 crc 
kubenswrapper[4803]: I0320 17:31:35.619082 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:35 crc kubenswrapper[4803]: I0320 17:31:35.619163 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.619389 4803 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.619440 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:37.619426775 +0000 UTC m=+907.531018845 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "metrics-server-cert" not found Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.619750 4803 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:31:35 crc kubenswrapper[4803]: E0320 17:31:35.619778 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:37.619771075 +0000 UTC m=+907.531363145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "webhook-server-cert" not found Mar 20 17:31:36 crc kubenswrapper[4803]: E0320 17:31:36.420574 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" podUID="f7f518fa-6e0e-431d-88f4-f835400eec2a" Mar 20 17:31:36 crc kubenswrapper[4803]: E0320 17:31:36.421875 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" podUID="6409d065-33f9-4e12-806a-b805e4b6e0ea" Mar 20 17:31:36 crc kubenswrapper[4803]: E0320 17:31:36.421907 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" podUID="e5270e18-ac4b-4f4d-8a6d-085699034cfe" Mar 20 17:31:36 crc kubenswrapper[4803]: E0320 17:31:36.422003 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" podUID="76713a0c-ed94-4e45-a947-e05e3ef0d3d6" Mar 20 17:31:37 crc kubenswrapper[4803]: I0320 17:31:37.141030 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.141277 4803 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.141335 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert podName:0dbcced3-01dd-45bb-8f6f-1733abb4f7db nodeName:}" failed. 
No retries permitted until 2026-03-20 17:31:41.141318992 +0000 UTC m=+911.052911072 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-qxc8h" (UID: "0dbcced3-01dd-45bb-8f6f-1733abb4f7db") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:37 crc kubenswrapper[4803]: I0320 17:31:37.344085 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.344359 4803 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.344420 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert podName:002615ba-1b17-467b-a536-7a631c5b434e nodeName:}" failed. No retries permitted until 2026-03-20 17:31:41.344401429 +0000 UTC m=+911.255993499 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5vccbg" (UID: "002615ba-1b17-467b-a536-7a631c5b434e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:37 crc kubenswrapper[4803]: I0320 17:31:37.648895 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:37 crc kubenswrapper[4803]: I0320 17:31:37.649024 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.649063 4803 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.649137 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:41.649118862 +0000 UTC m=+911.560711042 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "metrics-server-cert" not found Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.649176 4803 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:31:37 crc kubenswrapper[4803]: E0320 17:31:37.649267 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:41.649244265 +0000 UTC m=+911.560836425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "webhook-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: I0320 17:31:41.212921 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.213600 4803 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.214187 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert 
podName:0dbcced3-01dd-45bb-8f6f-1733abb4f7db nodeName:}" failed. No retries permitted until 2026-03-20 17:31:49.214167326 +0000 UTC m=+919.125759396 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert") pod "infra-operator-controller-manager-6f8b7f6fdf-qxc8h" (UID: "0dbcced3-01dd-45bb-8f6f-1733abb4f7db") : secret "infra-operator-webhook-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: I0320 17:31:41.416235 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.416423 4803 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.416496 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert podName:002615ba-1b17-467b-a536-7a631c5b434e nodeName:}" failed. No retries permitted until 2026-03-20 17:31:49.416479696 +0000 UTC m=+919.328071766 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5vccbg" (UID: "002615ba-1b17-467b-a536-7a631c5b434e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: I0320 17:31:41.720597 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:41 crc kubenswrapper[4803]: I0320 17:31:41.720928 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.721070 4803 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.721125 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:49.721107374 +0000 UTC m=+919.632699454 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "metrics-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.721482 4803 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:31:41 crc kubenswrapper[4803]: E0320 17:31:41.721511 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:31:49.721502385 +0000 UTC m=+919.633094465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "webhook-server-cert" not found Mar 20 17:31:46 crc kubenswrapper[4803]: E0320 17:31:46.449358 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 17:31:46 crc kubenswrapper[4803]: E0320 17:31:46.449986 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-94sxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-k8594_openstack-operators(5ea17b5b-3363-4c08-a7e3-52cbb4cb5616): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:31:46 crc kubenswrapper[4803]: E0320 17:31:46.451243 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" podUID="5ea17b5b-3363-4c08-a7e3-52cbb4cb5616" Mar 20 17:31:46 crc kubenswrapper[4803]: E0320 17:31:46.494120 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" podUID="5ea17b5b-3363-4c08-a7e3-52cbb4cb5616" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.316872 4803 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/redhat-marketplace-z4d9z"] Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.318187 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.328986 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4d9z"] Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.444594 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-catalog-content\") pod \"redhat-marketplace-z4d9z\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.444635 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2jtd\" (UniqueName: \"kubernetes.io/projected/72522871-8ebd-405a-9dcb-8323822a9151-kube-api-access-s2jtd\") pod \"redhat-marketplace-z4d9z\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.444805 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-utilities\") pod \"redhat-marketplace-z4d9z\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.546492 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-utilities\") pod \"redhat-marketplace-z4d9z\" (UID: 
\"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.546565 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-catalog-content\") pod \"redhat-marketplace-z4d9z\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.546584 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2jtd\" (UniqueName: \"kubernetes.io/projected/72522871-8ebd-405a-9dcb-8323822a9151-kube-api-access-s2jtd\") pod \"redhat-marketplace-z4d9z\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.547621 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-utilities\") pod \"redhat-marketplace-z4d9z\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.547965 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-catalog-content\") pod \"redhat-marketplace-z4d9z\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.568343 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2jtd\" (UniqueName: \"kubernetes.io/projected/72522871-8ebd-405a-9dcb-8323822a9151-kube-api-access-s2jtd\") pod \"redhat-marketplace-z4d9z\" (UID: 
\"72522871-8ebd-405a-9dcb-8323822a9151\") " pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: I0320 17:31:48.667285 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:31:48 crc kubenswrapper[4803]: E0320 17:31:48.763134 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 17:31:48 crc kubenswrapper[4803]: E0320 17:31:48.763272 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gcfzx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-9dwg8_openstack-operators(ff62aaff-bfdc-400e-b6ee-217356ba1a23): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:31:48 crc kubenswrapper[4803]: E0320 17:31:48.764350 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" podUID="ff62aaff-bfdc-400e-b6ee-217356ba1a23" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.043087 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4d9z"] Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.257582 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.265636 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0dbcced3-01dd-45bb-8f6f-1733abb4f7db-cert\") pod \"infra-operator-controller-manager-6f8b7f6fdf-qxc8h\" (UID: \"0dbcced3-01dd-45bb-8f6f-1733abb4f7db\") " pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.461249 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:31:49 crc kubenswrapper[4803]: E0320 17:31:49.461421 4803 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:49 crc kubenswrapper[4803]: E0320 17:31:49.461465 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert podName:002615ba-1b17-467b-a536-7a631c5b434e nodeName:}" failed. No retries permitted until 2026-03-20 17:32:05.461451375 +0000 UTC m=+935.373043445 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5vccbg" (UID: "002615ba-1b17-467b-a536-7a631c5b434e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.537913 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.554063 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" event={"ID":"524f49fa-3d73-4089-aa72-cbcfdfbed979","Type":"ContainerStarted","Data":"1c98e5221f32c3a326662211e4cf86e0dfbcc706b2dd5adc1a61f1f2706dc58d"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.554242 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.562851 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" event={"ID":"fefd7fa6-fd31-4b28-a7e9-1a4e630070fe","Type":"ContainerStarted","Data":"47f4e04b93cc98f1709284c825ced6c5a3f010e55853ce3ec711d482c325e74f"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.562976 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.565803 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" event={"ID":"75cb9425-bd1a-4311-a85f-76eee943e0e9","Type":"ContainerStarted","Data":"a98efd7bf7f9e48d6cf9d478f72178008111e3d8603b06b0642f11bfdd99ba19"} 
Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.566100 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.568153 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" event={"ID":"b01812c6-e35c-4699-8b7b-192a425bf0ce","Type":"ContainerStarted","Data":"1d1b8249dba8fdbf9a858547038c5915b24b06a3f1cbcedb2f79fb375506989c"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.568267 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.581440 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" event={"ID":"a8a0f2f5-7910-44c7-969d-204a7d1327d9","Type":"ContainerStarted","Data":"e2f06c08c92f5c3b2a6404e22ce5450952babbaef51e14fc2a8e1229d898152f"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.581469 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.587997 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" podStartSLOduration=2.922023223 podStartE2EDuration="16.58797817s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.05449519 +0000 UTC m=+903.966087260" lastFinishedPulling="2026-03-20 17:31:47.720450137 +0000 UTC m=+917.632042207" observedRunningTime="2026-03-20 17:31:49.586150127 +0000 UTC m=+919.497742197" watchObservedRunningTime="2026-03-20 17:31:49.58797817 +0000 UTC m=+919.499570240" Mar 20 17:31:49 
crc kubenswrapper[4803]: I0320 17:31:49.599620 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" event={"ID":"6616ca24-82bf-405a-b77b-8617c65ec76b","Type":"ContainerStarted","Data":"73c545710825c2ac75d054495beed455c82453135b89f498523ed10a4c2a86e2"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.599734 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.607842 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" event={"ID":"b6b15ce6-e69e-42ad-a356-9802f8750db4","Type":"ContainerStarted","Data":"9fd98d1b144c187cef78615bf0e0862ab3da8631679cb9b2f18a4620a552db59"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.608125 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.618476 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" podStartSLOduration=2.714335624 podStartE2EDuration="16.618458468s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.857384051 +0000 UTC m=+904.768976131" lastFinishedPulling="2026-03-20 17:31:48.761506905 +0000 UTC m=+918.673098975" observedRunningTime="2026-03-20 17:31:49.615588645 +0000 UTC m=+919.527180715" watchObservedRunningTime="2026-03-20 17:31:49.618458468 +0000 UTC m=+919.530050528" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.625316 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" 
event={"ID":"7a5c980d-5ccf-4e9d-9687-3119240ecc15","Type":"ContainerStarted","Data":"7533e5e2a8bcdc1355445c2160c6e4b3216ce2f325fc826f2150ecc68d716b96"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.625462 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.628270 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" event={"ID":"25fa5c58-b9f7-4cd8-b1c4-41c190df40f1","Type":"ContainerStarted","Data":"2f70e76a23bfb4834f55c5ba510e7bbf1b762ba03b432e88431faf8b9af12b9c"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.628880 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.629932 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" event={"ID":"cfe324bc-b8c8-4971-b1d5-ed9df499771f","Type":"ContainerStarted","Data":"a4dd964a5fefc0310d07ee562e37e5170f672a254de4db1450426d235cdd058a"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.630243 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.631019 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" event={"ID":"23dabc24-9851-4232-8f79-56c8615246c7","Type":"ContainerStarted","Data":"01d17f79716317c47cab00348790e5cc901949696ea4a10d13d0ee580829af36"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.631351 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.632123 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" event={"ID":"10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5","Type":"ContainerStarted","Data":"01c86ffcac90187d6c395bce9658a7aaa28e1ca862688b0e19aa74a262a57439"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.632431 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.644553 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" event={"ID":"b3eec30a-a8b9-40b8-a786-16a339efe990","Type":"ContainerStarted","Data":"42c9bc47b933b60a8db4d8f7e13db3b73a299e007a05e0a5efe19c155c4f8bfe"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.645183 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.650909 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerStarted","Data":"e6f39a5c444cbca9a1c7b39a853c9b42162eda29c5ea9af6b63ba72cea71f6a9"} Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.650948 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerStarted","Data":"d3502605ed70f3e5daa9f90381d941f218dd1f6fdf90e141ff01da54eb26cbb0"} Mar 20 17:31:49 crc kubenswrapper[4803]: E0320 17:31:49.652720 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" podUID="ff62aaff-bfdc-400e-b6ee-217356ba1a23" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.667249 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" podStartSLOduration=2.672700831 podStartE2EDuration="16.667234203s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.767022155 +0000 UTC m=+904.678614225" lastFinishedPulling="2026-03-20 17:31:48.761555507 +0000 UTC m=+918.673147597" observedRunningTime="2026-03-20 17:31:49.637600169 +0000 UTC m=+919.549192259" watchObservedRunningTime="2026-03-20 17:31:49.667234203 +0000 UTC m=+919.578826273" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.669495 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" podStartSLOduration=2.875236981 podStartE2EDuration="16.669489278s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.987512862 +0000 UTC m=+904.899104932" lastFinishedPulling="2026-03-20 17:31:48.781765159 +0000 UTC m=+918.693357229" observedRunningTime="2026-03-20 17:31:49.665416121 +0000 UTC m=+919.577008201" watchObservedRunningTime="2026-03-20 17:31:49.669489278 +0000 UTC m=+919.581081348" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.696718 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" podStartSLOduration=3.11310576 podStartE2EDuration="16.696704183s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 
17:31:35.211747717 +0000 UTC m=+905.123339787" lastFinishedPulling="2026-03-20 17:31:48.79534614 +0000 UTC m=+918.706938210" observedRunningTime="2026-03-20 17:31:49.692942464 +0000 UTC m=+919.604534544" watchObservedRunningTime="2026-03-20 17:31:49.696704183 +0000 UTC m=+919.608296253" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.765206 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.765293 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:31:49 crc kubenswrapper[4803]: E0320 17:31:49.765881 4803 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 17:31:49 crc kubenswrapper[4803]: E0320 17:31:49.765960 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:32:05.765941828 +0000 UTC m=+935.677533898 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "webhook-server-cert" not found Mar 20 17:31:49 crc kubenswrapper[4803]: E0320 17:31:49.766081 4803 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 17:31:49 crc kubenswrapper[4803]: E0320 17:31:49.766130 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs podName:36ca4d28-1feb-4c48-bba8-d078f85fc37f nodeName:}" failed. No retries permitted until 2026-03-20 17:32:05.766117043 +0000 UTC m=+935.677709113 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs") pod "openstack-operator-controller-manager-56f44579c8-sht64" (UID: "36ca4d28-1feb-4c48-bba8-d078f85fc37f") : secret "metrics-server-cert" not found Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.793655 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" podStartSLOduration=3.614775033 podStartE2EDuration="16.793638456s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.54111473 +0000 UTC m=+904.452706800" lastFinishedPulling="2026-03-20 17:31:47.719978153 +0000 UTC m=+917.631570223" observedRunningTime="2026-03-20 17:31:49.772640501 +0000 UTC m=+919.684232591" watchObservedRunningTime="2026-03-20 17:31:49.793638456 +0000 UTC m=+919.705230526" Mar 20 17:31:49 crc kubenswrapper[4803]: I0320 17:31:49.989508 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" podStartSLOduration=3.226661024 podStartE2EDuration="16.989487409s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.981172616 +0000 UTC m=+904.892764686" lastFinishedPulling="2026-03-20 17:31:48.743999001 +0000 UTC m=+918.655591071" observedRunningTime="2026-03-20 17:31:49.963172121 +0000 UTC m=+919.874764211" watchObservedRunningTime="2026-03-20 17:31:49.989487409 +0000 UTC m=+919.901079489" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.001379 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" podStartSLOduration=2.763260656 podStartE2EDuration="17.001363062s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.519719043 +0000 UTC m=+904.431311123" lastFinishedPulling="2026-03-20 17:31:48.757821469 +0000 UTC m=+918.669413529" observedRunningTime="2026-03-20 17:31:50.000912039 +0000 UTC m=+919.912504109" watchObservedRunningTime="2026-03-20 17:31:50.001363062 +0000 UTC m=+919.912955132" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.198268 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" podStartSLOduration=2.682707415 podStartE2EDuration="17.198248585s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.221471559 +0000 UTC m=+904.133063629" lastFinishedPulling="2026-03-20 17:31:48.737012719 +0000 UTC m=+918.648604799" observedRunningTime="2026-03-20 17:31:50.116745276 +0000 UTC m=+920.028337346" watchObservedRunningTime="2026-03-20 17:31:50.198248585 +0000 UTC m=+920.109840655" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.202626 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h"] Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.233118 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" podStartSLOduration=3.008783668 podStartE2EDuration="17.233102479s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.537212715 +0000 UTC m=+904.448804785" lastFinishedPulling="2026-03-20 17:31:48.761531526 +0000 UTC m=+918.673123596" observedRunningTime="2026-03-20 17:31:50.229851326 +0000 UTC m=+920.141443426" watchObservedRunningTime="2026-03-20 17:31:50.233102479 +0000 UTC m=+920.144694549" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.317410 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" podStartSLOduration=3.31579303 podStartE2EDuration="17.317390578s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.759879806 +0000 UTC m=+904.671471876" lastFinishedPulling="2026-03-20 17:31:48.761477354 +0000 UTC m=+918.673069424" observedRunningTime="2026-03-20 17:31:50.316860393 +0000 UTC m=+920.228452473" watchObservedRunningTime="2026-03-20 17:31:50.317390578 +0000 UTC m=+920.228982648" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.333987 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" podStartSLOduration=3.461789789 podStartE2EDuration="17.333965776s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.871297689 +0000 UTC m=+904.782889769" lastFinishedPulling="2026-03-20 17:31:48.743473676 +0000 UTC m=+918.655065756" observedRunningTime="2026-03-20 17:31:50.27374557 +0000 UTC m=+920.185337660" watchObservedRunningTime="2026-03-20 
17:31:50.333965776 +0000 UTC m=+920.245557846" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.382696 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" podStartSLOduration=2.8534117390000002 podStartE2EDuration="17.382675049s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.232805761 +0000 UTC m=+904.144397831" lastFinishedPulling="2026-03-20 17:31:48.762069071 +0000 UTC m=+918.673661141" observedRunningTime="2026-03-20 17:31:50.354975181 +0000 UTC m=+920.266567261" watchObservedRunningTime="2026-03-20 17:31:50.382675049 +0000 UTC m=+920.294267119" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.660008 4803 generic.go:334] "Generic (PLEG): container finished" podID="72522871-8ebd-405a-9dcb-8323822a9151" containerID="e6f39a5c444cbca9a1c7b39a853c9b42162eda29c5ea9af6b63ba72cea71f6a9" exitCode=0 Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.660072 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerDied","Data":"e6f39a5c444cbca9a1c7b39a853c9b42162eda29c5ea9af6b63ba72cea71f6a9"} Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.660101 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerStarted","Data":"c530a880676c11cd3a8e68b60a1e97720c979aa9175dd420fdb7b46224ab8f4b"} Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.663296 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" event={"ID":"0dbcced3-01dd-45bb-8f6f-1733abb4f7db","Type":"ContainerStarted","Data":"11426ae553083fad45681bacc08c432b6af514aefa587e321fab8137a1294fae"} Mar 20 17:31:50 crc kubenswrapper[4803]: 
I0320 17:31:50.702379 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tjpq8"] Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.703911 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.722322 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjpq8"] Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.800171 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k8dz\" (UniqueName: \"kubernetes.io/projected/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-kube-api-access-5k8dz\") pod \"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.800554 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-utilities\") pod \"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.800572 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-catalog-content\") pod \"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.901865 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-utilities\") pod 
\"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.902328 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-utilities\") pod \"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.902368 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-catalog-content\") pod \"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.902429 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k8dz\" (UniqueName: \"kubernetes.io/projected/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-kube-api-access-5k8dz\") pod \"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.902502 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-catalog-content\") pod \"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:50 crc kubenswrapper[4803]: I0320 17:31:50.932618 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k8dz\" (UniqueName: \"kubernetes.io/projected/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-kube-api-access-5k8dz\") pod 
\"community-operators-tjpq8\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:51 crc kubenswrapper[4803]: I0320 17:31:51.037351 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:31:51 crc kubenswrapper[4803]: I0320 17:31:51.553437 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tjpq8"] Mar 20 17:31:51 crc kubenswrapper[4803]: W0320 17:31:51.580923 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49be0b5d_e64d_4e04_93c8_568ad98e5d8b.slice/crio-625945d0a5500eaa67e20a5129fca4f4fe4d86e7d22bf3ecaf6b0b4b0c537604 WatchSource:0}: Error finding container 625945d0a5500eaa67e20a5129fca4f4fe4d86e7d22bf3ecaf6b0b4b0c537604: Status 404 returned error can't find the container with id 625945d0a5500eaa67e20a5129fca4f4fe4d86e7d22bf3ecaf6b0b4b0c537604 Mar 20 17:31:51 crc kubenswrapper[4803]: I0320 17:31:51.673427 4803 generic.go:334] "Generic (PLEG): container finished" podID="72522871-8ebd-405a-9dcb-8323822a9151" containerID="c530a880676c11cd3a8e68b60a1e97720c979aa9175dd420fdb7b46224ab8f4b" exitCode=0 Mar 20 17:31:51 crc kubenswrapper[4803]: I0320 17:31:51.673487 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerDied","Data":"c530a880676c11cd3a8e68b60a1e97720c979aa9175dd420fdb7b46224ab8f4b"} Mar 20 17:31:51 crc kubenswrapper[4803]: I0320 17:31:51.685136 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjpq8" event={"ID":"49be0b5d-e64d-4e04-93c8-568ad98e5d8b","Type":"ContainerStarted","Data":"625945d0a5500eaa67e20a5129fca4f4fe4d86e7d22bf3ecaf6b0b4b0c537604"} Mar 20 17:31:52 crc kubenswrapper[4803]: I0320 
17:31:52.701691 4803 generic.go:334] "Generic (PLEG): container finished" podID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerID="7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc" exitCode=0 Mar 20 17:31:52 crc kubenswrapper[4803]: I0320 17:31:52.701986 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjpq8" event={"ID":"49be0b5d-e64d-4e04-93c8-568ad98e5d8b","Type":"ContainerDied","Data":"7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc"} Mar 20 17:31:54 crc kubenswrapper[4803]: I0320 17:31:54.040962 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-sh6jl" Mar 20 17:31:54 crc kubenswrapper[4803]: I0320 17:31:54.229912 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dxp5d" Mar 20 17:31:54 crc kubenswrapper[4803]: I0320 17:31:54.458341 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-8gf25" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.748146 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerStarted","Data":"0b2767e577cb938fc13be147b170a48f9578281d7709f1398cef293ec32e2198"} Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.750549 4803 generic.go:334] "Generic (PLEG): container finished" podID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerID="b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45" exitCode=0 Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.750627 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjpq8" 
event={"ID":"49be0b5d-e64d-4e04-93c8-568ad98e5d8b","Type":"ContainerDied","Data":"b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45"} Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.753579 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" event={"ID":"e5270e18-ac4b-4f4d-8a6d-085699034cfe","Type":"ContainerStarted","Data":"0db1a2bb8921cb2c8361877c3c4f892a4a923c7974d2b6d5b24159abf66e2193"} Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.753745 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.756145 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" event={"ID":"0dbcced3-01dd-45bb-8f6f-1733abb4f7db","Type":"ContainerStarted","Data":"2c583c1b3caf718cfbf2422fb0178556c85fb97dbd64312ffbbd30605cbbd212"} Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.756275 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.757187 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" event={"ID":"f7f518fa-6e0e-431d-88f4-f835400eec2a","Type":"ContainerStarted","Data":"2e67eb40fb6100b77ed897dc138635ea3951e0a5fe8a9bbbfaac389f8ae57773"} Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.757329 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.758305 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" event={"ID":"6409d065-33f9-4e12-806a-b805e4b6e0ea","Type":"ContainerStarted","Data":"59df6b2bebd9c57aab90a95d4701037dd4a6986854bb5cd0f0fbb29a0f3e0bb6"} Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.758479 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.759228 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" event={"ID":"76713a0c-ed94-4e45-a947-e05e3ef0d3d6","Type":"ContainerStarted","Data":"9f042330c3c041b99d8f629daedb80271929302b3ff9fc45894839655b5a23b4"} Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.759341 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.770112 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z4d9z" podStartSLOduration=2.445248422 podStartE2EDuration="10.770094785s" podCreationTimestamp="2026-03-20 17:31:48 +0000 UTC" firstStartedPulling="2026-03-20 17:31:49.652915211 +0000 UTC m=+919.564507271" lastFinishedPulling="2026-03-20 17:31:57.977761524 +0000 UTC m=+927.889353634" observedRunningTime="2026-03-20 17:31:58.764670169 +0000 UTC m=+928.676262259" watchObservedRunningTime="2026-03-20 17:31:58.770094785 +0000 UTC m=+928.681686875" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.799211 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" podStartSLOduration=17.942389297 podStartE2EDuration="25.799192473s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 
17:31:50.222398101 +0000 UTC m=+920.133990171" lastFinishedPulling="2026-03-20 17:31:58.079201267 +0000 UTC m=+927.990793347" observedRunningTime="2026-03-20 17:31:58.793616483 +0000 UTC m=+928.705208573" watchObservedRunningTime="2026-03-20 17:31:58.799192473 +0000 UTC m=+928.710784533" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.820761 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" podStartSLOduration=2.858436229 podStartE2EDuration="25.820748445s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:35.011933137 +0000 UTC m=+904.923525207" lastFinishedPulling="2026-03-20 17:31:57.974245313 +0000 UTC m=+927.885837423" observedRunningTime="2026-03-20 17:31:58.81747435 +0000 UTC m=+928.729066420" watchObservedRunningTime="2026-03-20 17:31:58.820748445 +0000 UTC m=+928.732340505" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.834779 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" podStartSLOduration=2.8713350699999998 podStartE2EDuration="25.834751738s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:35.010854425 +0000 UTC m=+904.922446485" lastFinishedPulling="2026-03-20 17:31:57.974271073 +0000 UTC m=+927.885863153" observedRunningTime="2026-03-20 17:31:58.832273237 +0000 UTC m=+928.743865307" watchObservedRunningTime="2026-03-20 17:31:58.834751738 +0000 UTC m=+928.746343808" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.854827 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" podStartSLOduration=2.876388322 podStartE2EDuration="25.854806426s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:35.00384365 +0000 
UTC m=+904.915435720" lastFinishedPulling="2026-03-20 17:31:57.982261714 +0000 UTC m=+927.893853824" observedRunningTime="2026-03-20 17:31:58.847483745 +0000 UTC m=+928.759075825" watchObservedRunningTime="2026-03-20 17:31:58.854806426 +0000 UTC m=+928.766398506" Mar 20 17:31:58 crc kubenswrapper[4803]: I0320 17:31:58.869626 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" podStartSLOduration=2.893554805 podStartE2EDuration="25.869608663s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.998617267 +0000 UTC m=+904.910209337" lastFinishedPulling="2026-03-20 17:31:57.974671085 +0000 UTC m=+927.886263195" observedRunningTime="2026-03-20 17:31:58.868613574 +0000 UTC m=+928.780205654" watchObservedRunningTime="2026-03-20 17:31:58.869608663 +0000 UTC m=+928.781200733" Mar 20 17:31:59 crc kubenswrapper[4803]: I0320 17:31:59.769218 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjpq8" event={"ID":"49be0b5d-e64d-4e04-93c8-568ad98e5d8b","Type":"ContainerStarted","Data":"316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183"} Mar 20 17:31:59 crc kubenswrapper[4803]: I0320 17:31:59.792479 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tjpq8" podStartSLOduration=4.02244558 podStartE2EDuration="9.792458395s" podCreationTimestamp="2026-03-20 17:31:50 +0000 UTC" firstStartedPulling="2026-03-20 17:31:53.587461806 +0000 UTC m=+923.499053876" lastFinishedPulling="2026-03-20 17:31:59.357474621 +0000 UTC m=+929.269066691" observedRunningTime="2026-03-20 17:31:59.788375117 +0000 UTC m=+929.699967187" watchObservedRunningTime="2026-03-20 17:31:59.792458395 +0000 UTC m=+929.704050475" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.143259 4803 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29567132-9dbds"] Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.144086 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-9dbds" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.145429 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.146041 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.146089 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.152110 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-9dbds"] Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.267240 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg7n2\" (UniqueName: \"kubernetes.io/projected/caee5b1b-0dda-4525-bf7c-ac589fb5f730-kube-api-access-pg7n2\") pod \"auto-csr-approver-29567132-9dbds\" (UID: \"caee5b1b-0dda-4525-bf7c-ac589fb5f730\") " pod="openshift-infra/auto-csr-approver-29567132-9dbds" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.368560 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg7n2\" (UniqueName: \"kubernetes.io/projected/caee5b1b-0dda-4525-bf7c-ac589fb5f730-kube-api-access-pg7n2\") pod \"auto-csr-approver-29567132-9dbds\" (UID: \"caee5b1b-0dda-4525-bf7c-ac589fb5f730\") " pod="openshift-infra/auto-csr-approver-29567132-9dbds" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.386000 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg7n2\" (UniqueName: 
\"kubernetes.io/projected/caee5b1b-0dda-4525-bf7c-ac589fb5f730-kube-api-access-pg7n2\") pod \"auto-csr-approver-29567132-9dbds\" (UID: \"caee5b1b-0dda-4525-bf7c-ac589fb5f730\") " pod="openshift-infra/auto-csr-approver-29567132-9dbds" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.457558 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-9dbds" Mar 20 17:32:00 crc kubenswrapper[4803]: I0320 17:32:00.954857 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-9dbds"] Mar 20 17:32:00 crc kubenswrapper[4803]: W0320 17:32:00.977353 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaee5b1b_0dda_4525_bf7c_ac589fb5f730.slice/crio-74b44924415351fe08897b6dd6779479a8868a278d340c4c6f41b070a2efd945 WatchSource:0}: Error finding container 74b44924415351fe08897b6dd6779479a8868a278d340c4c6f41b070a2efd945: Status 404 returned error can't find the container with id 74b44924415351fe08897b6dd6779479a8868a278d340c4c6f41b070a2efd945 Mar 20 17:32:01 crc kubenswrapper[4803]: I0320 17:32:01.038543 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:32:01 crc kubenswrapper[4803]: I0320 17:32:01.038601 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:32:01 crc kubenswrapper[4803]: I0320 17:32:01.794873 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-9dbds" event={"ID":"caee5b1b-0dda-4525-bf7c-ac589fb5f730","Type":"ContainerStarted","Data":"74b44924415351fe08897b6dd6779479a8868a278d340c4c6f41b070a2efd945"} Mar 20 17:32:02 crc kubenswrapper[4803]: I0320 17:32:02.093925 4803 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-tjpq8" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="registry-server" probeResult="failure" output=< Mar 20 17:32:02 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 17:32:02 crc kubenswrapper[4803]: > Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.470192 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-7w5w2" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.496475 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-b7sz6" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.546756 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-lxz9k" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.632314 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b5q6h" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.632377 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-smx6n" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.669089 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-cqwk9" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.690083 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-wgxn7" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.780095 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-6pfw8" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.833234 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-xwbd2" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.870479 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-vmmmv" Mar 20 17:32:03 crc kubenswrapper[4803]: I0320 17:32:03.977693 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lktjh" Mar 20 17:32:04 crc kubenswrapper[4803]: I0320 17:32:04.038231 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-22skw" Mar 20 17:32:04 crc kubenswrapper[4803]: I0320 17:32:04.129241 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fgckh" Mar 20 17:32:04 crc kubenswrapper[4803]: I0320 17:32:04.166405 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-b7znw" Mar 20 17:32:05 crc kubenswrapper[4803]: I0320 17:32:05.498280 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:32:05 crc kubenswrapper[4803]: I0320 17:32:05.508652 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/002615ba-1b17-467b-a536-7a631c5b434e-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5vccbg\" (UID: \"002615ba-1b17-467b-a536-7a631c5b434e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:32:05 crc kubenswrapper[4803]: I0320 17:32:05.736092 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" Mar 20 17:32:05 crc kubenswrapper[4803]: I0320 17:32:05.803769 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:32:05 crc kubenswrapper[4803]: I0320 17:32:05.803870 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:32:05 crc kubenswrapper[4803]: I0320 17:32:05.809823 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-webhook-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:32:05 crc kubenswrapper[4803]: I0320 17:32:05.810710 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/36ca4d28-1feb-4c48-bba8-d078f85fc37f-metrics-certs\") pod \"openstack-operator-controller-manager-56f44579c8-sht64\" (UID: \"36ca4d28-1feb-4c48-bba8-d078f85fc37f\") " pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:32:06 crc kubenswrapper[4803]: I0320 17:32:06.011300 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg"] Mar 20 17:32:06 crc kubenswrapper[4803]: W0320 17:32:06.022207 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod002615ba_1b17_467b_a536_7a631c5b434e.slice/crio-7c51cc3d9de61e83756a3f8a234297f6682b133995119e3d92474e93f2baab95 WatchSource:0}: Error finding container 7c51cc3d9de61e83756a3f8a234297f6682b133995119e3d92474e93f2baab95: Status 404 returned error can't find the container with id 7c51cc3d9de61e83756a3f8a234297f6682b133995119e3d92474e93f2baab95 Mar 20 17:32:06 crc kubenswrapper[4803]: I0320 17:32:06.060163 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:32:06 crc kubenswrapper[4803]: I0320 17:32:06.544867 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64"] Mar 20 17:32:06 crc kubenswrapper[4803]: I0320 17:32:06.865205 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" event={"ID":"36ca4d28-1feb-4c48-bba8-d078f85fc37f","Type":"ContainerStarted","Data":"b5f1caa6a6061424d6643511139b9144294f35a899e4698dfa9012b7fae1c0cd"} Mar 20 17:32:06 crc kubenswrapper[4803]: I0320 17:32:06.865273 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" event={"ID":"002615ba-1b17-467b-a536-7a631c5b434e","Type":"ContainerStarted","Data":"7c51cc3d9de61e83756a3f8a234297f6682b133995119e3d92474e93f2baab95"} Mar 20 17:32:08 crc kubenswrapper[4803]: I0320 17:32:08.245984 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:32:08 crc kubenswrapper[4803]: I0320 17:32:08.246435 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:32:08 crc kubenswrapper[4803]: I0320 17:32:08.675744 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:32:08 crc kubenswrapper[4803]: 
I0320 17:32:08.675811 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:32:08 crc kubenswrapper[4803]: I0320 17:32:08.744560 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:32:08 crc kubenswrapper[4803]: I0320 17:32:08.940658 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z4d9z" Mar 20 17:32:09 crc kubenswrapper[4803]: I0320 17:32:09.017618 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4d9z"] Mar 20 17:32:09 crc kubenswrapper[4803]: I0320 17:32:09.546793 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6f8b7f6fdf-qxc8h" Mar 20 17:32:10 crc kubenswrapper[4803]: I0320 17:32:10.890210 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z4d9z" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="registry-server" containerID="cri-o://0b2767e577cb938fc13be147b170a48f9578281d7709f1398cef293ec32e2198" gracePeriod=2 Mar 20 17:32:11 crc kubenswrapper[4803]: I0320 17:32:11.106979 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:32:11 crc kubenswrapper[4803]: I0320 17:32:11.190150 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:32:12 crc kubenswrapper[4803]: I0320 17:32:12.389945 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjpq8"] Mar 20 17:32:12 crc kubenswrapper[4803]: I0320 17:32:12.905660 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="72522871-8ebd-405a-9dcb-8323822a9151" containerID="0b2767e577cb938fc13be147b170a48f9578281d7709f1398cef293ec32e2198" exitCode=0 Mar 20 17:32:12 crc kubenswrapper[4803]: I0320 17:32:12.905710 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerDied","Data":"0b2767e577cb938fc13be147b170a48f9578281d7709f1398cef293ec32e2198"} Mar 20 17:32:12 crc kubenswrapper[4803]: I0320 17:32:12.907457 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" event={"ID":"36ca4d28-1feb-4c48-bba8-d078f85fc37f","Type":"ContainerStarted","Data":"27e3905029b11492fbfb240dda433bccd90290faca5e7a65c06df5977e40c617"} Mar 20 17:32:12 crc kubenswrapper[4803]: I0320 17:32:12.907662 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tjpq8" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="registry-server" containerID="cri-o://316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183" gracePeriod=2 Mar 20 17:32:12 crc kubenswrapper[4803]: I0320 17:32:12.907818 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.890340 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.916488 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k8dz\" (UniqueName: \"kubernetes.io/projected/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-kube-api-access-5k8dz\") pod \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.916581 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-utilities\") pod \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.916633 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-catalog-content\") pod \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\" (UID: \"49be0b5d-e64d-4e04-93c8-568ad98e5d8b\") " Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.917482 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-utilities" (OuterVolumeSpecName: "utilities") pod "49be0b5d-e64d-4e04-93c8-568ad98e5d8b" (UID: "49be0b5d-e64d-4e04-93c8-568ad98e5d8b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.923881 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64" podStartSLOduration=40.923853314 podStartE2EDuration="40.923853314s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:32:12.976935758 +0000 UTC m=+942.888527828" watchObservedRunningTime="2026-03-20 17:32:13.923853314 +0000 UTC m=+943.835445394" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.928641 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-kube-api-access-5k8dz" (OuterVolumeSpecName: "kube-api-access-5k8dz") pod "49be0b5d-e64d-4e04-93c8-568ad98e5d8b" (UID: "49be0b5d-e64d-4e04-93c8-568ad98e5d8b"). InnerVolumeSpecName "kube-api-access-5k8dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.929505 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z4d9z" event={"ID":"72522871-8ebd-405a-9dcb-8323822a9151","Type":"ContainerDied","Data":"d3502605ed70f3e5daa9f90381d941f218dd1f6fdf90e141ff01da54eb26cbb0"} Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.929563 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3502605ed70f3e5daa9f90381d941f218dd1f6fdf90e141ff01da54eb26cbb0" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.932653 4803 generic.go:334] "Generic (PLEG): container finished" podID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerID="316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183" exitCode=0 Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.933633 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjpq8" event={"ID":"49be0b5d-e64d-4e04-93c8-568ad98e5d8b","Type":"ContainerDied","Data":"316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183"} Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.933668 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tjpq8" event={"ID":"49be0b5d-e64d-4e04-93c8-568ad98e5d8b","Type":"ContainerDied","Data":"625945d0a5500eaa67e20a5129fca4f4fe4d86e7d22bf3ecaf6b0b4b0c537604"} Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.933695 4803 scope.go:117] "RemoveContainer" containerID="316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.933698 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tjpq8" Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.942251 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4d9z"
Mar 20 17:32:13 crc kubenswrapper[4803]: I0320 17:32:13.980171 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49be0b5d-e64d-4e04-93c8-568ad98e5d8b" (UID: "49be0b5d-e64d-4e04-93c8-568ad98e5d8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.002851 4803 scope.go:117] "RemoveContainer" containerID="b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.017744 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2jtd\" (UniqueName: \"kubernetes.io/projected/72522871-8ebd-405a-9dcb-8323822a9151-kube-api-access-s2jtd\") pod \"72522871-8ebd-405a-9dcb-8323822a9151\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") "
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.017959 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-catalog-content\") pod \"72522871-8ebd-405a-9dcb-8323822a9151\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") "
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.018037 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-utilities\") pod \"72522871-8ebd-405a-9dcb-8323822a9151\" (UID: \"72522871-8ebd-405a-9dcb-8323822a9151\") "
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.018736 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k8dz\" (UniqueName: \"kubernetes.io/projected/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-kube-api-access-5k8dz\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.018756 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.018766 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49be0b5d-e64d-4e04-93c8-568ad98e5d8b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.019125 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-utilities" (OuterVolumeSpecName: "utilities") pod "72522871-8ebd-405a-9dcb-8323822a9151" (UID: "72522871-8ebd-405a-9dcb-8323822a9151"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.025849 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72522871-8ebd-405a-9dcb-8323822a9151-kube-api-access-s2jtd" (OuterVolumeSpecName: "kube-api-access-s2jtd") pod "72522871-8ebd-405a-9dcb-8323822a9151" (UID: "72522871-8ebd-405a-9dcb-8323822a9151"). InnerVolumeSpecName "kube-api-access-s2jtd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.026026 4803 scope.go:117] "RemoveContainer" containerID="7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.046421 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72522871-8ebd-405a-9dcb-8323822a9151" (UID: "72522871-8ebd-405a-9dcb-8323822a9151"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.049640 4803 scope.go:117] "RemoveContainer" containerID="316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183"
Mar 20 17:32:14 crc kubenswrapper[4803]: E0320 17:32:14.050162 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183\": container with ID starting with 316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183 not found: ID does not exist" containerID="316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.050193 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183"} err="failed to get container status \"316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183\": rpc error: code = NotFound desc = could not find container \"316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183\": container with ID starting with 316db77b52d5820355fa0a4d75576d35a9e160baaef32d181180f5fe473d6183 not found: ID does not exist"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.050215 4803 scope.go:117] "RemoveContainer" containerID="b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45"
Mar 20 17:32:14 crc kubenswrapper[4803]: E0320 17:32:14.050516 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45\": container with ID starting with b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45 not found: ID does not exist" containerID="b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.050555 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45"} err="failed to get container status \"b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45\": rpc error: code = NotFound desc = could not find container \"b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45\": container with ID starting with b42d049c3499911e11fb4f9e786c2ba133a7f29fdc0f60bfe72829d84393ea45 not found: ID does not exist"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.050568 4803 scope.go:117] "RemoveContainer" containerID="7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc"
Mar 20 17:32:14 crc kubenswrapper[4803]: E0320 17:32:14.050873 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc\": container with ID starting with 7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc not found: ID does not exist" containerID="7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.050890 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc"} err="failed to get container status \"7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc\": rpc error: code = NotFound desc = could not find container \"7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc\": container with ID starting with 7f9bdd716732a9c4c5dc5f9ac3c8771b47b0cc46cda14933d338c0df0215e9fc not found: ID does not exist"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.119798 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.119841 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72522871-8ebd-405a-9dcb-8323822a9151-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.119855 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2jtd\" (UniqueName: \"kubernetes.io/projected/72522871-8ebd-405a-9dcb-8323822a9151-kube-api-access-s2jtd\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.265275 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tjpq8"]
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.271935 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tjpq8"]
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.863076 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" path="/var/lib/kubelet/pods/49be0b5d-e64d-4e04-93c8-568ad98e5d8b/volumes"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.943900 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" event={"ID":"5ea17b5b-3363-4c08-a7e3-52cbb4cb5616","Type":"ContainerStarted","Data":"5e431f9cba18d754aa79d65dd1be15380b85c47224ef68d0be79ef2b70760f86"}
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.944489 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.948433 4803 generic.go:334] "Generic (PLEG): container finished" podID="caee5b1b-0dda-4525-bf7c-ac589fb5f730" containerID="56b5f2c3a0afdb80fd2c803f9cb50e733f01d14610e6d0680e39257648da9d04" exitCode=0
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.948573 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-9dbds" event={"ID":"caee5b1b-0dda-4525-bf7c-ac589fb5f730","Type":"ContainerDied","Data":"56b5f2c3a0afdb80fd2c803f9cb50e733f01d14610e6d0680e39257648da9d04"}
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.951083 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" event={"ID":"002615ba-1b17-467b-a536-7a631c5b434e","Type":"ContainerStarted","Data":"b666df1c4213fc0a931ca82e51b22fba9cfcc704aef7c382176e3e491c6774b3"}
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.951227 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.953128 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" event={"ID":"ff62aaff-bfdc-400e-b6ee-217356ba1a23","Type":"ContainerStarted","Data":"e15bfcab87b0e70caefa58e1d97b245626d7664bfeed2dedcf4346dbcd76451a"}
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.953165 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z4d9z"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.953711 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8"
Mar 20 17:32:14 crc kubenswrapper[4803]: I0320 17:32:14.976266 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594" podStartSLOduration=2.6800908679999997 podStartE2EDuration="41.976239228s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.537112263 +0000 UTC m=+904.448704333" lastFinishedPulling="2026-03-20 17:32:13.833260623 +0000 UTC m=+943.744852693" observedRunningTime="2026-03-20 17:32:14.967513956 +0000 UTC m=+944.879106116" watchObservedRunningTime="2026-03-20 17:32:14.976239228 +0000 UTC m=+944.887831338"
Mar 20 17:32:15 crc kubenswrapper[4803]: I0320 17:32:15.017417 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4d9z"]
Mar 20 17:32:15 crc kubenswrapper[4803]: I0320 17:32:15.026098 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z4d9z"]
Mar 20 17:32:15 crc kubenswrapper[4803]: I0320 17:32:15.058650 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg" podStartSLOduration=34.164025467 podStartE2EDuration="42.058605031s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:32:06.024308627 +0000 UTC m=+935.935900697" lastFinishedPulling="2026-03-20 17:32:13.918888151 +0000 UTC m=+943.830480261" observedRunningTime="2026-03-20 17:32:15.045214635 +0000 UTC m=+944.956806745" watchObservedRunningTime="2026-03-20 17:32:15.058605031 +0000 UTC m=+944.970197141"
Mar 20 17:32:15 crc kubenswrapper[4803]: I0320 17:32:15.070853 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8" podStartSLOduration=3.113970409 podStartE2EDuration="42.070830614s" podCreationTimestamp="2026-03-20 17:31:33 +0000 UTC" firstStartedPulling="2026-03-20 17:31:34.877104849 +0000 UTC m=+904.788696919" lastFinishedPulling="2026-03-20 17:32:13.833965044 +0000 UTC m=+943.745557124" observedRunningTime="2026-03-20 17:32:15.069866606 +0000 UTC m=+944.981458716" watchObservedRunningTime="2026-03-20 17:32:15.070830614 +0000 UTC m=+944.982422724"
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.070125 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-56f44579c8-sht64"
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.300570 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-9dbds"
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.353310 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg7n2\" (UniqueName: \"kubernetes.io/projected/caee5b1b-0dda-4525-bf7c-ac589fb5f730-kube-api-access-pg7n2\") pod \"caee5b1b-0dda-4525-bf7c-ac589fb5f730\" (UID: \"caee5b1b-0dda-4525-bf7c-ac589fb5f730\") "
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.358186 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caee5b1b-0dda-4525-bf7c-ac589fb5f730-kube-api-access-pg7n2" (OuterVolumeSpecName: "kube-api-access-pg7n2") pod "caee5b1b-0dda-4525-bf7c-ac589fb5f730" (UID: "caee5b1b-0dda-4525-bf7c-ac589fb5f730"). InnerVolumeSpecName "kube-api-access-pg7n2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.455298 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg7n2\" (UniqueName: \"kubernetes.io/projected/caee5b1b-0dda-4525-bf7c-ac589fb5f730-kube-api-access-pg7n2\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.862700 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72522871-8ebd-405a-9dcb-8323822a9151" path="/var/lib/kubelet/pods/72522871-8ebd-405a-9dcb-8323822a9151/volumes"
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.976953 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567132-9dbds" event={"ID":"caee5b1b-0dda-4525-bf7c-ac589fb5f730","Type":"ContainerDied","Data":"74b44924415351fe08897b6dd6779479a8868a278d340c4c6f41b070a2efd945"}
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.977032 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b44924415351fe08897b6dd6779479a8868a278d340c4c6f41b070a2efd945"
Mar 20 17:32:16 crc kubenswrapper[4803]: I0320 17:32:16.977039 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567132-9dbds"
Mar 20 17:32:17 crc kubenswrapper[4803]: I0320 17:32:17.375702 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-v4lng"]
Mar 20 17:32:17 crc kubenswrapper[4803]: I0320 17:32:17.385025 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567126-v4lng"]
Mar 20 17:32:18 crc kubenswrapper[4803]: I0320 17:32:18.862829 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aa173fa-44b8-4141-8238-32c03c99cbbc" path="/var/lib/kubelet/pods/5aa173fa-44b8-4141-8238-32c03c99cbbc/volumes"
Mar 20 17:32:23 crc kubenswrapper[4803]: I0320 17:32:23.674645 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-k8594"
Mar 20 17:32:23 crc kubenswrapper[4803]: I0320 17:32:23.859693 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-9dwg8"
Mar 20 17:32:25 crc kubenswrapper[4803]: I0320 17:32:25.745899 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5vccbg"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.409963 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mtgll"]
Mar 20 17:32:29 crc kubenswrapper[4803]: E0320 17:32:29.410727 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="extract-utilities"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.410750 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="extract-utilities"
Mar 20 17:32:29 crc kubenswrapper[4803]: E0320 17:32:29.410774 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="registry-server"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.410788 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="registry-server"
Mar 20 17:32:29 crc kubenswrapper[4803]: E0320 17:32:29.410807 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="extract-utilities"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.410819 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="extract-utilities"
Mar 20 17:32:29 crc kubenswrapper[4803]: E0320 17:32:29.410836 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="extract-content"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.410849 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="extract-content"
Mar 20 17:32:29 crc kubenswrapper[4803]: E0320 17:32:29.410894 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="extract-content"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.410905 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="extract-content"
Mar 20 17:32:29 crc kubenswrapper[4803]: E0320 17:32:29.410923 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="registry-server"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.410935 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="registry-server"
Mar 20 17:32:29 crc kubenswrapper[4803]: E0320 17:32:29.410955 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caee5b1b-0dda-4525-bf7c-ac589fb5f730" containerName="oc"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.410967 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="caee5b1b-0dda-4525-bf7c-ac589fb5f730" containerName="oc"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.411189 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="49be0b5d-e64d-4e04-93c8-568ad98e5d8b" containerName="registry-server"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.411214 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="72522871-8ebd-405a-9dcb-8323822a9151" containerName="registry-server"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.411233 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="caee5b1b-0dda-4525-bf7c-ac589fb5f730" containerName="oc"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.412972 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.416406 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mtgll"]
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.462757 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-catalog-content\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.462808 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-utilities\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.462848 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cw2j\" (UniqueName: \"kubernetes.io/projected/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-kube-api-access-5cw2j\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.563946 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-catalog-content\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.564001 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-utilities\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.564039 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cw2j\" (UniqueName: \"kubernetes.io/projected/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-kube-api-access-5cw2j\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.564439 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-catalog-content\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.564900 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-utilities\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.584409 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cw2j\" (UniqueName: \"kubernetes.io/projected/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-kube-api-access-5cw2j\") pod \"certified-operators-mtgll\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") " pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:29 crc kubenswrapper[4803]: I0320 17:32:29.745068 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:30 crc kubenswrapper[4803]: I0320 17:32:30.245742 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mtgll"]
Mar 20 17:32:30 crc kubenswrapper[4803]: W0320 17:32:30.248824 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7486f533_2fa2_4e7b_98f2_1ab4121b3bf7.slice/crio-0cc50fe138db16a691d9ed3967d2173353ecff963297c776ff9c35b27097eab7 WatchSource:0}: Error finding container 0cc50fe138db16a691d9ed3967d2173353ecff963297c776ff9c35b27097eab7: Status 404 returned error can't find the container with id 0cc50fe138db16a691d9ed3967d2173353ecff963297c776ff9c35b27097eab7
Mar 20 17:32:31 crc kubenswrapper[4803]: I0320 17:32:31.126642 4803 generic.go:334] "Generic (PLEG): container finished" podID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerID="7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d" exitCode=0
Mar 20 17:32:31 crc kubenswrapper[4803]: I0320 17:32:31.126768 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtgll" event={"ID":"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7","Type":"ContainerDied","Data":"7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d"}
Mar 20 17:32:31 crc kubenswrapper[4803]: I0320 17:32:31.127139 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtgll" event={"ID":"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7","Type":"ContainerStarted","Data":"0cc50fe138db16a691d9ed3967d2173353ecff963297c776ff9c35b27097eab7"}
Mar 20 17:32:33 crc kubenswrapper[4803]: I0320 17:32:33.146490 4803 generic.go:334] "Generic (PLEG): container finished" podID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerID="3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01" exitCode=0
Mar 20 17:32:33 crc kubenswrapper[4803]: I0320 17:32:33.146596 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtgll" event={"ID":"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7","Type":"ContainerDied","Data":"3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01"}
Mar 20 17:32:34 crc kubenswrapper[4803]: I0320 17:32:34.159204 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtgll" event={"ID":"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7","Type":"ContainerStarted","Data":"e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10"}
Mar 20 17:32:34 crc kubenswrapper[4803]: I0320 17:32:34.195621 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mtgll" podStartSLOduration=2.697483846 podStartE2EDuration="5.19559592s" podCreationTimestamp="2026-03-20 17:32:29 +0000 UTC" firstStartedPulling="2026-03-20 17:32:31.130765546 +0000 UTC m=+961.042357646" lastFinishedPulling="2026-03-20 17:32:33.62887761 +0000 UTC m=+963.540469720" observedRunningTime="2026-03-20 17:32:34.184075369 +0000 UTC m=+964.095667519" watchObservedRunningTime="2026-03-20 17:32:34.19559592 +0000 UTC m=+964.107188030"
Mar 20 17:32:38 crc kubenswrapper[4803]: I0320 17:32:38.246141 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 17:32:38 crc kubenswrapper[4803]: I0320 17:32:38.246417 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 17:32:39 crc kubenswrapper[4803]: I0320 17:32:39.746092 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:39 crc kubenswrapper[4803]: I0320 17:32:39.746570 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:39 crc kubenswrapper[4803]: I0320 17:32:39.815106 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:40 crc kubenswrapper[4803]: I0320 17:32:40.262942 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:40 crc kubenswrapper[4803]: I0320 17:32:40.312042 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mtgll"]
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.224845 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mtgll" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="registry-server" containerID="cri-o://e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10" gracePeriod=2
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.647197 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.668326 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cw2j\" (UniqueName: \"kubernetes.io/projected/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-kube-api-access-5cw2j\") pod \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") "
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.668363 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-catalog-content\") pod \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") "
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.668454 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-utilities\") pod \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\" (UID: \"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7\") "
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.669588 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-utilities" (OuterVolumeSpecName: "utilities") pod "7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" (UID: "7486f533-2fa2-4e7b-98f2-1ab4121b3bf7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.678680 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-kube-api-access-5cw2j" (OuterVolumeSpecName: "kube-api-access-5cw2j") pod "7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" (UID: "7486f533-2fa2-4e7b-98f2-1ab4121b3bf7"). InnerVolumeSpecName "kube-api-access-5cw2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.753357 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" (UID: "7486f533-2fa2-4e7b-98f2-1ab4121b3bf7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.770195 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cw2j\" (UniqueName: \"kubernetes.io/projected/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-kube-api-access-5cw2j\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.770228 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:42 crc kubenswrapper[4803]: I0320 17:32:42.770240 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.235229 4803 generic.go:334] "Generic (PLEG): container finished" podID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerID="e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10" exitCode=0
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.235279 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtgll" event={"ID":"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7","Type":"ContainerDied","Data":"e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10"}
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.235308 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mtgll" event={"ID":"7486f533-2fa2-4e7b-98f2-1ab4121b3bf7","Type":"ContainerDied","Data":"0cc50fe138db16a691d9ed3967d2173353ecff963297c776ff9c35b27097eab7"}
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.235328 4803 scope.go:117] "RemoveContainer" containerID="e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.235481 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mtgll"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.258976 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mtgll"]
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.259352 4803 scope.go:117] "RemoveContainer" containerID="3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.269020 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mtgll"]
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.285706 4803 scope.go:117] "RemoveContainer" containerID="7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.311283 4803 scope.go:117] "RemoveContainer" containerID="e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10"
Mar 20 17:32:43 crc kubenswrapper[4803]: E0320 17:32:43.311767 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10\": container with ID starting with e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10 not found: ID does not exist" containerID="e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.311824 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10"} err="failed to get container status \"e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10\": rpc error: code = NotFound desc = could not find container \"e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10\": container with ID starting with e06d59cf59cac4a967449d5f28409f5fb38b86e38ec98fb6616dfe600db2ef10 not found: ID does not exist"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.311873 4803 scope.go:117] "RemoveContainer" containerID="3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01"
Mar 20 17:32:43 crc kubenswrapper[4803]: E0320 17:32:43.312184 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01\": container with ID starting with 3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01 not found: ID does not exist" containerID="3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.312218 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01"} err="failed to get container status \"3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01\": rpc error: code = NotFound desc = could not find container \"3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01\": container with ID starting with 3f3d7c991a3d270c8d80cac2c84cd70ff8278e9849ff1d5ca6cd0d2a3f13be01 not found: ID does not exist"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.312239 4803 scope.go:117] "RemoveContainer" containerID="7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d"
Mar 20 17:32:43 crc kubenswrapper[4803]: E0320 17:32:43.312476 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d\": container with ID starting with 7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d not found: ID does not exist" containerID="7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d"
Mar 20 17:32:43 crc kubenswrapper[4803]: I0320 17:32:43.312514 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d"} err="failed to get container status \"7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d\": rpc error: code = NotFound desc = could not find container \"7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d\": container with ID starting with 7532ada3bc9217ad6999619c83099280c9cd54b41b578851f8f0e4df174f8a2d not found: ID does not exist"
Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.304409 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-58jcj"]
Mar 20 17:32:44 crc kubenswrapper[4803]: E0320 17:32:44.304972 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="registry-server"
Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.304985 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="registry-server"
Mar 20 17:32:44 crc kubenswrapper[4803]: E0320 17:32:44.304997 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="extract-utilities"
Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.305004 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="extract-utilities"
Mar 20 17:32:44 crc kubenswrapper[4803]: E0320 17:32:44.305024 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="extract-content" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.305032 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="extract-content" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.305165 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" containerName="registry-server" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.305813 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.307627 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2cj4g" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.310815 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.310988 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.311048 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.321159 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-58jcj"] Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.367948 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-79s5t"] Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.368968 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.371266 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.384225 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-79s5t"] Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.499323 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-config\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.499395 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f341d0c-6709-4465-8679-513a8e659fb1-config\") pod \"dnsmasq-dns-675f4bcbfc-58jcj\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.499667 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.499931 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bksd\" (UniqueName: \"kubernetes.io/projected/aad9a413-82e1-48d0-8d36-6e522215d0b6-kube-api-access-8bksd\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 
crc kubenswrapper[4803]: I0320 17:32:44.500027 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjwk5\" (UniqueName: \"kubernetes.io/projected/0f341d0c-6709-4465-8679-513a8e659fb1-kube-api-access-jjwk5\") pod \"dnsmasq-dns-675f4bcbfc-58jcj\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.601413 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-config\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.601507 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f341d0c-6709-4465-8679-513a8e659fb1-config\") pod \"dnsmasq-dns-675f4bcbfc-58jcj\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.601616 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.601763 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bksd\" (UniqueName: \"kubernetes.io/projected/aad9a413-82e1-48d0-8d36-6e522215d0b6-kube-api-access-8bksd\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.601827 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjwk5\" (UniqueName: \"kubernetes.io/projected/0f341d0c-6709-4465-8679-513a8e659fb1-kube-api-access-jjwk5\") pod \"dnsmasq-dns-675f4bcbfc-58jcj\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.602698 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-config\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.602704 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.603681 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f341d0c-6709-4465-8679-513a8e659fb1-config\") pod \"dnsmasq-dns-675f4bcbfc-58jcj\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.621406 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bksd\" (UniqueName: \"kubernetes.io/projected/aad9a413-82e1-48d0-8d36-6e522215d0b6-kube-api-access-8bksd\") pod \"dnsmasq-dns-78dd6ddcc-79s5t\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.625903 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjwk5\" (UniqueName: 
\"kubernetes.io/projected/0f341d0c-6709-4465-8679-513a8e659fb1-kube-api-access-jjwk5\") pod \"dnsmasq-dns-675f4bcbfc-58jcj\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.682605 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.859021 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7486f533-2fa2-4e7b-98f2-1ab4121b3bf7" path="/var/lib/kubelet/pods/7486f533-2fa2-4e7b-98f2-1ab4121b3bf7/volumes" Mar 20 17:32:44 crc kubenswrapper[4803]: I0320 17:32:44.921080 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:32:45 crc kubenswrapper[4803]: I0320 17:32:45.126027 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-79s5t"] Mar 20 17:32:45 crc kubenswrapper[4803]: I0320 17:32:45.134485 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-58jcj"] Mar 20 17:32:45 crc kubenswrapper[4803]: I0320 17:32:45.280444 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" event={"ID":"aad9a413-82e1-48d0-8d36-6e522215d0b6","Type":"ContainerStarted","Data":"a6b658d8e70be4751f4b3f2343e0e3f176819e9252a45d1e568566759275f48c"} Mar 20 17:32:45 crc kubenswrapper[4803]: I0320 17:32:45.281514 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" event={"ID":"0f341d0c-6709-4465-8679-513a8e659fb1","Type":"ContainerStarted","Data":"aecb644bf77db927a075d83a0bd086503ee461df4e898def2a9d7eeb7f90b2e0"} Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.085670 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-58jcj"] Mar 20 17:32:47 crc kubenswrapper[4803]: 
I0320 17:32:47.109993 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pflx7"] Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.111150 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.118039 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pflx7"] Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.242437 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.242870 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-config\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.242938 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5vpz\" (UniqueName: \"kubernetes.io/projected/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-kube-api-access-g5vpz\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.345650 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5vpz\" (UniqueName: \"kubernetes.io/projected/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-kube-api-access-g5vpz\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: 
\"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.345768 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.345792 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-config\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.346721 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-config\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.346729 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.388452 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5vpz\" (UniqueName: \"kubernetes.io/projected/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-kube-api-access-g5vpz\") pod \"dnsmasq-dns-666b6646f7-pflx7\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc 
kubenswrapper[4803]: I0320 17:32:47.430985 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.473145 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-79s5t"] Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.506041 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gw794"] Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.507257 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.526339 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gw794"] Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.550091 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-config\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.550136 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k77l\" (UniqueName: \"kubernetes.io/projected/f122552a-098b-4046-9a7b-82f89b7aeed7-kube-api-access-8k77l\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.550180 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.652305 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-config\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.652378 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-config\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.652423 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k77l\" (UniqueName: \"kubernetes.io/projected/f122552a-098b-4046-9a7b-82f89b7aeed7-kube-api-access-8k77l\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.652743 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.653329 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.667635 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k77l\" (UniqueName: \"kubernetes.io/projected/f122552a-098b-4046-9a7b-82f89b7aeed7-kube-api-access-8k77l\") pod \"dnsmasq-dns-57d769cc4f-gw794\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.883467 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:32:47 crc kubenswrapper[4803]: I0320 17:32:47.961786 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pflx7"] Mar 20 17:32:47 crc kubenswrapper[4803]: W0320 17:32:47.965212 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod733fdcd1_a9fa_4454_a2dc_c6c51c5ceebc.slice/crio-03906fc99af8c988f89975c2cfa182910480410662f6db466782262359ac8eb3 WatchSource:0}: Error finding container 03906fc99af8c988f89975c2cfa182910480410662f6db466782262359ac8eb3: Status 404 returned error can't find the container with id 03906fc99af8c988f89975c2cfa182910480410662f6db466782262359ac8eb3 Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.082625 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.085513 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.088493 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rc669" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.088713 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.088807 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.088993 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.089085 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.089245 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.089132 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.106855 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.264798 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265128 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265152 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265186 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265221 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxgml\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-kube-api-access-bxgml\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265248 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265266 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-config-data\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265290 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b537f5f-54f2-4c70-be7b-4c57f84c572c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265305 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265322 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b537f5f-54f2-4c70-be7b-4c57f84c572c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.265344 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.306978 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" 
event={"ID":"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc","Type":"ContainerStarted","Data":"03906fc99af8c988f89975c2cfa182910480410662f6db466782262359ac8eb3"} Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.325407 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gw794"] Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366614 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366654 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366694 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366712 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxgml\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-kube-api-access-bxgml\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366738 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366755 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-config-data\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366780 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b537f5f-54f2-4c70-be7b-4c57f84c572c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366794 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366807 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b537f5f-54f2-4c70-be7b-4c57f84c572c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366827 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " 
pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366854 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.366978 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.367131 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.367259 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.369078 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.369393 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-config-data\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.369684 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.373606 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.374359 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b537f5f-54f2-4c70-be7b-4c57f84c572c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.374447 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.380534 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b537f5f-54f2-4c70-be7b-4c57f84c572c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " 
pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.388311 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxgml\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-kube-api-access-bxgml\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.392823 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.409807 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.411053 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.414809 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.414986 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.415090 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.415054 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.416451 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.417350 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.427656 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.428408 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.436297 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4kqgt" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569113 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569166 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569200 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569232 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569394 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569437 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9w4d\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-kube-api-access-s9w4d\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569544 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569564 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569697 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569764 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.569813 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.671779 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.671851 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.671874 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672311 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672337 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672368 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672384 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9w4d\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-kube-api-access-s9w4d\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672417 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672433 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672474 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.672559 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.673029 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.673046 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.673280 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 
17:32:48.673478 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.674091 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.674746 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.677061 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.677235 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.680185 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.682513 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.688808 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9w4d\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-kube-api-access-s9w4d\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.697016 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:48 crc kubenswrapper[4803]: I0320 17:32:48.765850 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.855691 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.858131 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.861143 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.861845 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lb7rl" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.862283 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.862384 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.870371 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.874948 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992626 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-config-data-default\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992673 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be331baf-1bef-41ab-ac10-b8686ecb5a30-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992697 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-fhm2f\" (UniqueName: \"kubernetes.io/projected/be331baf-1bef-41ab-ac10-b8686ecb5a30-kube-api-access-fhm2f\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992735 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-kolla-config\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992790 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992849 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-operator-scripts\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992876 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be331baf-1bef-41ab-ac10-b8686ecb5a30-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:49 crc kubenswrapper[4803]: I0320 17:32:49.992897 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/be331baf-1bef-41ab-ac10-b8686ecb5a30-config-data-generated\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094642 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-config-data-default\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094699 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be331baf-1bef-41ab-ac10-b8686ecb5a30-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094769 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhm2f\" (UniqueName: \"kubernetes.io/projected/be331baf-1bef-41ab-ac10-b8686ecb5a30-kube-api-access-fhm2f\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094804 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-kolla-config\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094845 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: 
\"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094893 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-operator-scripts\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094920 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be331baf-1bef-41ab-ac10-b8686ecb5a30-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.094955 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be331baf-1bef-41ab-ac10-b8686ecb5a30-config-data-generated\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.095288 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/be331baf-1bef-41ab-ac10-b8686ecb5a30-config-data-generated\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.095665 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc 
kubenswrapper[4803]: I0320 17:32:50.095729 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-config-data-default\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.096779 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-kolla-config\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.096835 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be331baf-1bef-41ab-ac10-b8686ecb5a30-operator-scripts\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.101318 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/be331baf-1bef-41ab-ac10-b8686ecb5a30-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.101383 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be331baf-1bef-41ab-ac10-b8686ecb5a30-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.116146 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhm2f\" (UniqueName: 
\"kubernetes.io/projected/be331baf-1bef-41ab-ac10-b8686ecb5a30-kube-api-access-fhm2f\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.135262 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"be331baf-1bef-41ab-ac10-b8686ecb5a30\") " pod="openstack/openstack-galera-0" Mar 20 17:32:50 crc kubenswrapper[4803]: I0320 17:32:50.194375 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.307463 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.309187 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.313943 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-r2rvj" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.314832 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.314972 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.315093 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.323472 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.416834 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.416895 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.416925 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.416984 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1b71013-6f7a-4559-a3bc-af90c284cada-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.417007 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt5qp\" (UniqueName: \"kubernetes.io/projected/f1b71013-6f7a-4559-a3bc-af90c284cada-kube-api-access-zt5qp\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.417036 
4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.417081 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b71013-6f7a-4559-a3bc-af90c284cada-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.417162 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b71013-6f7a-4559-a3bc-af90c284cada-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518469 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518509 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518563 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1b71013-6f7a-4559-a3bc-af90c284cada-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518584 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt5qp\" (UniqueName: \"kubernetes.io/projected/f1b71013-6f7a-4559-a3bc-af90c284cada-kube-api-access-zt5qp\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518609 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518659 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1b71013-6f7a-4559-a3bc-af90c284cada-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518715 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.518931 4803 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b71013-6f7a-4559-a3bc-af90c284cada-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.519011 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f1b71013-6f7a-4559-a3bc-af90c284cada-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.519021 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.519841 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.520276 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.524578 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f1b71013-6f7a-4559-a3bc-af90c284cada-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.524596 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1b71013-6f7a-4559-a3bc-af90c284cada-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.525008 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f1b71013-6f7a-4559-a3bc-af90c284cada-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.536821 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.538391 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt5qp\" (UniqueName: \"kubernetes.io/projected/f1b71013-6f7a-4559-a3bc-af90c284cada-kube-api-access-zt5qp\") pod \"openstack-cell1-galera-0\" (UID: \"f1b71013-6f7a-4559-a3bc-af90c284cada\") " pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.560137 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.561694 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.563776 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.564171 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gn2ds" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.564237 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.566751 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.620721 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd627740-5358-468a-bb90-21d52992a407-config-data\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.620942 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dxj\" (UniqueName: \"kubernetes.io/projected/cd627740-5358-468a-bb90-21d52992a407-kube-api-access-k4dxj\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.621021 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd627740-5358-468a-bb90-21d52992a407-kolla-config\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.621113 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd627740-5358-468a-bb90-21d52992a407-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.621264 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd627740-5358-468a-bb90-21d52992a407-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.672860 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.722304 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dxj\" (UniqueName: \"kubernetes.io/projected/cd627740-5358-468a-bb90-21d52992a407-kube-api-access-k4dxj\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.722368 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd627740-5358-468a-bb90-21d52992a407-kolla-config\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.722404 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd627740-5358-468a-bb90-21d52992a407-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.722448 4803 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd627740-5358-468a-bb90-21d52992a407-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.722537 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd627740-5358-468a-bb90-21d52992a407-config-data\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.723487 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd627740-5358-468a-bb90-21d52992a407-kolla-config\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.723911 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd627740-5358-468a-bb90-21d52992a407-config-data\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.727743 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd627740-5358-468a-bb90-21d52992a407-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.728475 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd627740-5358-468a-bb90-21d52992a407-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc 
kubenswrapper[4803]: I0320 17:32:51.740353 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dxj\" (UniqueName: \"kubernetes.io/projected/cd627740-5358-468a-bb90-21d52992a407-kube-api-access-k4dxj\") pod \"memcached-0\" (UID: \"cd627740-5358-468a-bb90-21d52992a407\") " pod="openstack/memcached-0" Mar 20 17:32:51 crc kubenswrapper[4803]: I0320 17:32:51.898844 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 17:32:52 crc kubenswrapper[4803]: I0320 17:32:52.364038 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" event={"ID":"f122552a-098b-4046-9a7b-82f89b7aeed7","Type":"ContainerStarted","Data":"1f06f54baef0292a8b0fb2ac5463fc2c7fed72120a8a366039097e0372813d2d"} Mar 20 17:32:53 crc kubenswrapper[4803]: I0320 17:32:53.945596 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:32:53 crc kubenswrapper[4803]: I0320 17:32:53.947171 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:32:53 crc kubenswrapper[4803]: I0320 17:32:53.954274 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-khm49" Mar 20 17:32:53 crc kubenswrapper[4803]: I0320 17:32:53.962243 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:32:54 crc kubenswrapper[4803]: I0320 17:32:54.060825 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcjrb\" (UniqueName: \"kubernetes.io/projected/3df34a16-c42d-48fa-90d4-711071acb1a8-kube-api-access-lcjrb\") pod \"kube-state-metrics-0\" (UID: \"3df34a16-c42d-48fa-90d4-711071acb1a8\") " pod="openstack/kube-state-metrics-0" Mar 20 17:32:54 crc kubenswrapper[4803]: I0320 17:32:54.162131 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcjrb\" (UniqueName: \"kubernetes.io/projected/3df34a16-c42d-48fa-90d4-711071acb1a8-kube-api-access-lcjrb\") pod \"kube-state-metrics-0\" (UID: \"3df34a16-c42d-48fa-90d4-711071acb1a8\") " pod="openstack/kube-state-metrics-0" Mar 20 17:32:54 crc kubenswrapper[4803]: I0320 17:32:54.179436 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcjrb\" (UniqueName: \"kubernetes.io/projected/3df34a16-c42d-48fa-90d4-711071acb1a8-kube-api-access-lcjrb\") pod \"kube-state-metrics-0\" (UID: \"3df34a16-c42d-48fa-90d4-711071acb1a8\") " pod="openstack/kube-state-metrics-0" Mar 20 17:32:54 crc kubenswrapper[4803]: I0320 17:32:54.285977 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.635883 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-56z85"] Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.637117 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.640208 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-27d8m" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.640368 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.640429 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.658838 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56z85"] Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.712433 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-l7cc6"] Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.712892 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-combined-ca-bundle\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.712971 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-run-ovn\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " 
pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.713009 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf6w9\" (UniqueName: \"kubernetes.io/projected/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-kube-api-access-rf6w9\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.713038 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-run\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.713075 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-ovn-controller-tls-certs\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.713110 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-log-ovn\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.713133 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-scripts\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 
20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.719503 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.723487 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l7cc6"] Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.813758 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-run-ovn\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.813803 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf6w9\" (UniqueName: \"kubernetes.io/projected/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-kube-api-access-rf6w9\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.813829 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-lib\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.814871 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-run\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.814348 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-run-ovn\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.814950 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-ovn-controller-tls-certs\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.814993 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-log-ovn\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.815010 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-log\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.815073 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-run\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.815201 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-var-log-ovn\") pod \"ovn-controller-56z85\" (UID: 
\"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.815029 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlkxt\" (UniqueName: \"kubernetes.io/projected/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-kube-api-access-mlkxt\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.815249 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-scripts\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.815270 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-scripts\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.816730 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-run\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.816764 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-combined-ca-bundle\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " 
pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.816783 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-etc-ovs\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.817048 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-scripts\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.821173 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-ovn-controller-tls-certs\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.821222 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-combined-ca-bundle\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.833161 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf6w9\" (UniqueName: \"kubernetes.io/projected/1d21c995-a420-4ac7-8cd9-c186be9e4ba0-kube-api-access-rf6w9\") pod \"ovn-controller-56z85\" (UID: \"1d21c995-a420-4ac7-8cd9-c186be9e4ba0\") " pod="openstack/ovn-controller-56z85" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918199 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-lib\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918349 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-log\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918370 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlkxt\" (UniqueName: \"kubernetes.io/projected/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-kube-api-access-mlkxt\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918405 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-scripts\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918468 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-run\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918497 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-etc-ovs\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918545 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-lib\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.918853 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-run\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.919018 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-var-log\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.919216 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-etc-ovs\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.921693 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-scripts\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 
17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.937458 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlkxt\" (UniqueName: \"kubernetes.io/projected/d0bcca9a-5da9-4ffc-897f-e3a0ae093324-kube-api-access-mlkxt\") pod \"ovn-controller-ovs-l7cc6\" (UID: \"d0bcca9a-5da9-4ffc-897f-e3a0ae093324\") " pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:57 crc kubenswrapper[4803]: I0320 17:32:57.966973 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56z85" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.035448 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.308436 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.310135 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.312788 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-6dzjc" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.312980 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.313107 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.313755 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.315928 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.317842 4803 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.424302 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9af9eb-0645-4fe0-a558-6ae86595685e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.424343 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.424375 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c9af9eb-0645-4fe0-a558-6ae86595685e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.424407 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.424633 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc 
kubenswrapper[4803]: I0320 17:32:58.424789 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnnq4\" (UniqueName: \"kubernetes.io/projected/7c9af9eb-0645-4fe0-a558-6ae86595685e-kube-api-access-nnnq4\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.424839 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9af9eb-0645-4fe0-a558-6ae86595685e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.424907 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527000 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527102 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnnq4\" (UniqueName: \"kubernetes.io/projected/7c9af9eb-0645-4fe0-a558-6ae86595685e-kube-api-access-nnnq4\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527138 4803 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9af9eb-0645-4fe0-a558-6ae86595685e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527176 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527219 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9af9eb-0645-4fe0-a558-6ae86595685e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527249 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527287 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c9af9eb-0645-4fe0-a558-6ae86595685e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527300 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") device mount 
path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527323 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.527998 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c9af9eb-0645-4fe0-a558-6ae86595685e-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.528169 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9af9eb-0645-4fe0-a558-6ae86595685e-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.528505 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c9af9eb-0645-4fe0-a558-6ae86595685e-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.531390 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.531834 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.543033 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnnq4\" (UniqueName: \"kubernetes.io/projected/7c9af9eb-0645-4fe0-a558-6ae86595685e-kube-api-access-nnnq4\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.548263 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c9af9eb-0645-4fe0-a558-6ae86595685e-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.557034 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"7c9af9eb-0645-4fe0-a558-6ae86595685e\") " pod="openstack/ovsdbserver-sb-0" Mar 20 17:32:58 crc kubenswrapper[4803]: I0320 17:32:58.632180 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: E0320 17:33:00.220849 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:33:00 crc kubenswrapper[4803]: E0320 17:33:00.221837 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jjwk5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFi
lesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-58jcj_openstack(0f341d0c-6709-4465-8679-513a8e659fb1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:33:00 crc kubenswrapper[4803]: E0320 17:33:00.223695 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" podUID="0f341d0c-6709-4465-8679-513a8e659fb1" Mar 20 17:33:00 crc kubenswrapper[4803]: E0320 17:33:00.262052 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 17:33:00 crc kubenswrapper[4803]: E0320 17:33:00.265691 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8bksd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-79s5t_openstack(aad9a413-82e1-48d0-8d36-6e522215d0b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:33:00 crc kubenswrapper[4803]: E0320 17:33:00.266957 4803 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" podUID="aad9a413-82e1-48d0-8d36-6e522215d0b6" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.644705 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.657585 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.659931 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.683968 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pghf4" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.688827 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.689119 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.689274 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.706783 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.776762 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 
17:33:00.776806 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.776844 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.776869 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.776898 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd6p7\" (UniqueName: \"kubernetes.io/projected/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-kube-api-access-qd6p7\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.776944 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.776971 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.777063 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-config\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.878845 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879247 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879340 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-config\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879422 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879504 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879638 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879716 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879726 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.879839 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd6p7\" (UniqueName: \"kubernetes.io/projected/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-kube-api-access-qd6p7\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 
17:33:00.879272 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.887239 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-config\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.887965 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.894946 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.896081 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.900029 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.902293 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd6p7\" (UniqueName: \"kubernetes.io/projected/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-kube-api-access-qd6p7\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.908142 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab4e6fd1-195c-4ff0-8288-14454e4ea4f1-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:00 crc kubenswrapper[4803]: I0320 17:33:00.917928 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1\") " pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.196350 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.277239 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.290982 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.300855 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 17:33:01 crc kubenswrapper[4803]: W0320 17:33:01.302680 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe331baf_1bef_41ab_ac10_b8686ecb5a30.slice/crio-a63d7c9527af4acde9a0408da7886156e467fa59292ee3b1c1641a4b7dbc1d0e WatchSource:0}: Error finding container a63d7c9527af4acde9a0408da7886156e467fa59292ee3b1c1641a4b7dbc1d0e: Status 404 returned error can't find the container with id a63d7c9527af4acde9a0408da7886156e467fa59292ee3b1c1641a4b7dbc1d0e Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.345378 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.375129 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.388268 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjwk5\" (UniqueName: \"kubernetes.io/projected/0f341d0c-6709-4465-8679-513a8e659fb1-kube-api-access-jjwk5\") pod \"0f341d0c-6709-4465-8679-513a8e659fb1\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.388414 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-dns-svc\") pod \"aad9a413-82e1-48d0-8d36-6e522215d0b6\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.388437 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-config\") pod \"aad9a413-82e1-48d0-8d36-6e522215d0b6\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.388466 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bksd\" (UniqueName: \"kubernetes.io/projected/aad9a413-82e1-48d0-8d36-6e522215d0b6-kube-api-access-8bksd\") pod \"aad9a413-82e1-48d0-8d36-6e522215d0b6\" (UID: \"aad9a413-82e1-48d0-8d36-6e522215d0b6\") " Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.388504 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f341d0c-6709-4465-8679-513a8e659fb1-config\") pod \"0f341d0c-6709-4465-8679-513a8e659fb1\" (UID: \"0f341d0c-6709-4465-8679-513a8e659fb1\") " Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.389224 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0f341d0c-6709-4465-8679-513a8e659fb1-config" (OuterVolumeSpecName: "config") pod "0f341d0c-6709-4465-8679-513a8e659fb1" (UID: "0f341d0c-6709-4465-8679-513a8e659fb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.389416 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.389943 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aad9a413-82e1-48d0-8d36-6e522215d0b6" (UID: "aad9a413-82e1-48d0-8d36-6e522215d0b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.390047 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-config" (OuterVolumeSpecName: "config") pod "aad9a413-82e1-48d0-8d36-6e522215d0b6" (UID: "aad9a413-82e1-48d0-8d36-6e522215d0b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.394328 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f341d0c-6709-4465-8679-513a8e659fb1-kube-api-access-jjwk5" (OuterVolumeSpecName: "kube-api-access-jjwk5") pod "0f341d0c-6709-4465-8679-513a8e659fb1" (UID: "0f341d0c-6709-4465-8679-513a8e659fb1"). InnerVolumeSpecName "kube-api-access-jjwk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.396431 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad9a413-82e1-48d0-8d36-6e522215d0b6-kube-api-access-8bksd" (OuterVolumeSpecName: "kube-api-access-8bksd") pod "aad9a413-82e1-48d0-8d36-6e522215d0b6" (UID: "aad9a413-82e1-48d0-8d36-6e522215d0b6"). InnerVolumeSpecName "kube-api-access-8bksd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:01 crc kubenswrapper[4803]: W0320 17:33:01.414078 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c9af9eb_0645_4fe0_a558_6ae86595685e.slice/crio-edd344e66b6cb7a205eb732b4b59d3020aecc892e0990ff445c7e617582af890 WatchSource:0}: Error finding container edd344e66b6cb7a205eb732b4b59d3020aecc892e0990ff445c7e617582af890: Status 404 returned error can't find the container with id edd344e66b6cb7a205eb732b4b59d3020aecc892e0990ff445c7e617582af890 Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.435802 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 17:33:01 crc kubenswrapper[4803]: W0320 17:33:01.445974 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd627740_5358_468a_bb90_21d52992a407.slice/crio-20fb47d02306a8f56c974758504c4dd289474ac6f92380f01eabee1d97b308c9 WatchSource:0}: Error finding container 20fb47d02306a8f56c974758504c4dd289474ac6f92380f01eabee1d97b308c9: Status 404 returned error can't find the container with id 20fb47d02306a8f56c974758504c4dd289474ac6f92380f01eabee1d97b308c9 Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.451363 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56z85"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.480195 4803 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l7cc6"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.490330 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.490355 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad9a413-82e1-48d0-8d36-6e522215d0b6-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.490365 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bksd\" (UniqueName: \"kubernetes.io/projected/aad9a413-82e1-48d0-8d36-6e522215d0b6-kube-api-access-8bksd\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.490401 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f341d0c-6709-4465-8679-513a8e659fb1-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.490412 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjwk5\" (UniqueName: \"kubernetes.io/projected/0f341d0c-6709-4465-8679-513a8e659fb1-kube-api-access-jjwk5\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:01 crc kubenswrapper[4803]: W0320 17:33:01.586674 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0bcca9a_5da9_4ffc_897f_e3a0ae093324.slice/crio-3b08fcf0d1e5989fa43065fff0cbc6d8e0a216c2ce09fbbe289f68fd3c9ac7ce WatchSource:0}: Error finding container 3b08fcf0d1e5989fa43065fff0cbc6d8e0a216c2ce09fbbe289f68fd3c9ac7ce: Status 404 returned error can't find the container with id 3b08fcf0d1e5989fa43065fff0cbc6d8e0a216c2ce09fbbe289f68fd3c9ac7ce Mar 20 17:33:01 crc kubenswrapper[4803]: W0320 17:33:01.587897 4803 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d21c995_a420_4ac7_8cd9_c186be9e4ba0.slice/crio-23e1e0041b7bc7fe00854197bd76852be8fb6a7ea2041690bf13dbeb60cf1d74 WatchSource:0}: Error finding container 23e1e0041b7bc7fe00854197bd76852be8fb6a7ea2041690bf13dbeb60cf1d74: Status 404 returned error can't find the container with id 23e1e0041b7bc7fe00854197bd76852be8fb6a7ea2041690bf13dbeb60cf1d74 Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.691892 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3df34a16-c42d-48fa-90d4-711071acb1a8","Type":"ContainerStarted","Data":"ad10c7c858414480d4cf90af3d454ebc2fe2d6af5404defe34bf65b8a781ac5b"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.693881 4803 generic.go:334] "Generic (PLEG): container finished" podID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerID="803a37cf1277419dbb96bac0c257d5eb59f7f7af3a1e1e54a4e638a129d7bc13" exitCode=0 Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.693950 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" event={"ID":"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc","Type":"ContainerDied","Data":"803a37cf1277419dbb96bac0c257d5eb59f7f7af3a1e1e54a4e638a129d7bc13"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.695187 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b537f5f-54f2-4c70-be7b-4c57f84c572c","Type":"ContainerStarted","Data":"74d965a7cf07649bab222faee00aedfa14447bb2ced0ee5316e7ff0dfb875c3d"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.697218 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l7cc6" event={"ID":"d0bcca9a-5da9-4ffc-897f-e3a0ae093324","Type":"ContainerStarted","Data":"3b08fcf0d1e5989fa43065fff0cbc6d8e0a216c2ce09fbbe289f68fd3c9ac7ce"} Mar 20 17:33:01 crc 
kubenswrapper[4803]: I0320 17:33:01.698488 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f1b71013-6f7a-4559-a3bc-af90c284cada","Type":"ContainerStarted","Data":"9894da7233e1b166daea8cc44bf73818e93ca0c2a9fa11f4a7f2a5b044c9d59b"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.699708 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"be331baf-1bef-41ab-ac10-b8686ecb5a30","Type":"ContainerStarted","Data":"a63d7c9527af4acde9a0408da7886156e467fa59292ee3b1c1641a4b7dbc1d0e"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.703137 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" event={"ID":"aad9a413-82e1-48d0-8d36-6e522215d0b6","Type":"ContainerDied","Data":"a6b658d8e70be4751f4b3f2343e0e3f176819e9252a45d1e568566759275f48c"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.703207 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-79s5t" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.705109 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cd627740-5358-468a-bb90-21d52992a407","Type":"ContainerStarted","Data":"20fb47d02306a8f56c974758504c4dd289474ac6f92380f01eabee1d97b308c9"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.710792 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56z85" event={"ID":"1d21c995-a420-4ac7-8cd9-c186be9e4ba0","Type":"ContainerStarted","Data":"23e1e0041b7bc7fe00854197bd76852be8fb6a7ea2041690bf13dbeb60cf1d74"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.712917 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" event={"ID":"0f341d0c-6709-4465-8679-513a8e659fb1","Type":"ContainerDied","Data":"aecb644bf77db927a075d83a0bd086503ee461df4e898def2a9d7eeb7f90b2e0"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.712993 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-58jcj" Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.718466 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c9af9eb-0645-4fe0-a558-6ae86595685e","Type":"ContainerStarted","Data":"edd344e66b6cb7a205eb732b4b59d3020aecc892e0990ff445c7e617582af890"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.724266 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b4b88aa-f18b-40c9-b8ad-dbd3739565da","Type":"ContainerStarted","Data":"a9f8181a3c133fcf816ab8ce69e6138c45c81660b198e72764725377ab38fc99"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.725768 4803 generic.go:334] "Generic (PLEG): container finished" podID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerID="ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370" exitCode=0 Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.725808 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" event={"ID":"f122552a-098b-4046-9a7b-82f89b7aeed7","Type":"ContainerDied","Data":"ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370"} Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.786623 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-58jcj"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.802662 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-58jcj"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.822719 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-79s5t"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.831492 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-79s5t"] Mar 20 17:33:01 crc kubenswrapper[4803]: I0320 17:33:01.864062 4803 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 17:33:02 crc kubenswrapper[4803]: I0320 17:33:02.735376 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1","Type":"ContainerStarted","Data":"c2526a7acc7523b33ced287085b1a8299665a18330a4551cd33a2e770b6a853a"} Mar 20 17:33:02 crc kubenswrapper[4803]: I0320 17:33:02.857944 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f341d0c-6709-4465-8679-513a8e659fb1" path="/var/lib/kubelet/pods/0f341d0c-6709-4465-8679-513a8e659fb1/volumes" Mar 20 17:33:02 crc kubenswrapper[4803]: I0320 17:33:02.858550 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad9a413-82e1-48d0-8d36-6e522215d0b6" path="/var/lib/kubelet/pods/aad9a413-82e1-48d0-8d36-6e522215d0b6/volumes" Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.245796 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.246459 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.246506 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.247218 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6b7da3dcd15f360247e71d17a90f38b394013be03e278a625e311daaaf87c2cd"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.247300 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://6b7da3dcd15f360247e71d17a90f38b394013be03e278a625e311daaaf87c2cd" gracePeriod=600 Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.785810 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="6b7da3dcd15f360247e71d17a90f38b394013be03e278a625e311daaaf87c2cd" exitCode=0 Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.785848 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"6b7da3dcd15f360247e71d17a90f38b394013be03e278a625e311daaaf87c2cd"} Mar 20 17:33:08 crc kubenswrapper[4803]: I0320 17:33:08.785908 4803 scope.go:117] "RemoveContainer" containerID="b24a42c256c66e693667314faf1192fa4d816bc92003b13635ba6ca267c3a888" Mar 20 17:33:11 crc kubenswrapper[4803]: I0320 17:33:11.194700 4803 scope.go:117] "RemoveContainer" containerID="ba3a027d817d4d5f8a7593861130eb6761ed9fc3fbada53e331d2f88668691c5" Mar 20 17:33:14 crc kubenswrapper[4803]: E0320 17:33:14.084551 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 17:33:14 crc kubenswrapper[4803]: E0320 17:33:14.085178 4803 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 17:33:14 crc 
kubenswrapper[4803]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 17:33:14 crc kubenswrapper[4803]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 17:33:14 crc kubenswrapper[4803]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 17:33:14 crc kubenswrapper[4803]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 17:33:14 crc kubenswrapper[4803]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:33:14 crc kubenswrapper[4803]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:33:14 crc kubenswrapper[4803]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 17:33:14 crc kubenswrapper[4803]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 17:33:14 crc kubenswrapper[4803]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxgml,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(4b537f5f-54f2-4c70-be7b-4c57f84c572c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 17:33:14 crc kubenswrapper[4803]: > logger="UnhandledError" 
Mar 20 17:33:14 crc kubenswrapper[4803]: E0320 17:33:14.086416 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.832442 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56z85" event={"ID":"1d21c995-a420-4ac7-8cd9-c186be9e4ba0","Type":"ContainerStarted","Data":"fc3cbd834be6dc4dcf7992bff48161736fea15abf9f44ed53b8c3392be65ba53"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.832942 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-56z85" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.834332 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c9af9eb-0645-4fe0-a558-6ae86595685e","Type":"ContainerStarted","Data":"eaa60d4f36433d146c2717428953ebe837a0a6b7f6c6f2aab6d4669d50791b6d"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.835900 4803 generic.go:334] "Generic (PLEG): container finished" podID="d0bcca9a-5da9-4ffc-897f-e3a0ae093324" containerID="ae8943e13ba9f56096b3a40229e145318f28f4353a92762d8af7b6efc70c44ae" exitCode=0 Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.835981 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l7cc6" event={"ID":"d0bcca9a-5da9-4ffc-897f-e3a0ae093324","Type":"ContainerDied","Data":"ae8943e13ba9f56096b3a40229e145318f28f4353a92762d8af7b6efc70c44ae"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.837941 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"be331baf-1bef-41ab-ac10-b8686ecb5a30","Type":"ContainerStarted","Data":"0314bf964cdd872f97d5b922059aa51a6f55ac8887870fcfb63b4ed787a9cf0f"} Mar 
20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.839815 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3df34a16-c42d-48fa-90d4-711071acb1a8","Type":"ContainerStarted","Data":"be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.839992 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.843031 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"1bf505c950c915ca8f0c808968521bcabc9e302497ac83cfddf349e9c7d8bb55"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.844618 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cd627740-5358-468a-bb90-21d52992a407","Type":"ContainerStarted","Data":"59462b49ac954295e14d203cb838380e5d7f6d89bd7d0e430357460a2db466c0"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.844742 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.861634 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.861687 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" event={"ID":"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc","Type":"ContainerStarted","Data":"fe0dd3bde21dad01158d94818b90e69352faf707bed9dbc343c031fda9784ed2"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.861707 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.861721 
4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" event={"ID":"f122552a-098b-4046-9a7b-82f89b7aeed7","Type":"ContainerStarted","Data":"c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.861732 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f1b71013-6f7a-4559-a3bc-af90c284cada","Type":"ContainerStarted","Data":"76271f958fa321714f27ed8196b368246d326659943dd6df698749ab43d1ae0a"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.861746 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1","Type":"ContainerStarted","Data":"3481e475a94c774fda61e09fbc23325ab02af73ef5ccf3316666461dabc73190"} Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.867450 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-56z85" podStartSLOduration=5.456755053 podStartE2EDuration="17.867432989s" podCreationTimestamp="2026-03-20 17:32:57 +0000 UTC" firstStartedPulling="2026-03-20 17:33:01.590370708 +0000 UTC m=+991.501962778" lastFinishedPulling="2026-03-20 17:33:14.001048634 +0000 UTC m=+1003.912640714" observedRunningTime="2026-03-20 17:33:14.859047307 +0000 UTC m=+1004.770639377" watchObservedRunningTime="2026-03-20 17:33:14.867432989 +0000 UTC m=+1004.779025059" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.915843 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" podStartSLOduration=15.452203219 podStartE2EDuration="27.915825683s" podCreationTimestamp="2026-03-20 17:32:47 +0000 UTC" firstStartedPulling="2026-03-20 17:32:47.968231832 +0000 UTC m=+977.879823902" lastFinishedPulling="2026-03-20 17:33:00.431854296 +0000 UTC m=+990.343446366" observedRunningTime="2026-03-20 17:33:14.913798855 
+0000 UTC m=+1004.825390935" watchObservedRunningTime="2026-03-20 17:33:14.915825683 +0000 UTC m=+1004.827417753" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.936313 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" podStartSLOduration=19.06938503 podStartE2EDuration="27.936295663s" podCreationTimestamp="2026-03-20 17:32:47 +0000 UTC" firstStartedPulling="2026-03-20 17:32:51.539742866 +0000 UTC m=+981.451334936" lastFinishedPulling="2026-03-20 17:33:00.406653499 +0000 UTC m=+990.318245569" observedRunningTime="2026-03-20 17:33:14.930604209 +0000 UTC m=+1004.842196289" watchObservedRunningTime="2026-03-20 17:33:14.936295663 +0000 UTC m=+1004.847887753" Mar 20 17:33:14 crc kubenswrapper[4803]: I0320 17:33:14.967862 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.419514017000001 podStartE2EDuration="21.967836822s" podCreationTimestamp="2026-03-20 17:32:53 +0000 UTC" firstStartedPulling="2026-03-20 17:33:00.86926097 +0000 UTC m=+990.780853030" lastFinishedPulling="2026-03-20 17:33:12.417583745 +0000 UTC m=+1002.329175835" observedRunningTime="2026-03-20 17:33:14.962433897 +0000 UTC m=+1004.874026017" watchObservedRunningTime="2026-03-20 17:33:14.967836822 +0000 UTC m=+1004.879428892" Mar 20 17:33:15 crc kubenswrapper[4803]: I0320 17:33:15.867832 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b4b88aa-f18b-40c9-b8ad-dbd3739565da","Type":"ContainerStarted","Data":"cb59d50529a5a90c57fcd7f2353539848ddae2913f0390264bfdfe1eccbb70f0"} Mar 20 17:33:15 crc kubenswrapper[4803]: I0320 17:33:15.873498 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l7cc6" event={"ID":"d0bcca9a-5da9-4ffc-897f-e3a0ae093324","Type":"ContainerStarted","Data":"e910cf764f83b891294940abcc9018cebcb8de39ae892317a8bb5ebbe34c8cea"} Mar 20 17:33:15 
crc kubenswrapper[4803]: I0320 17:33:15.897642 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=14.880347254 podStartE2EDuration="24.897625504s" podCreationTimestamp="2026-03-20 17:32:51 +0000 UTC" firstStartedPulling="2026-03-20 17:33:01.450464166 +0000 UTC m=+991.362056236" lastFinishedPulling="2026-03-20 17:33:11.467742126 +0000 UTC m=+1001.379334486" observedRunningTime="2026-03-20 17:33:15.027884312 +0000 UTC m=+1004.939476392" watchObservedRunningTime="2026-03-20 17:33:15.897625504 +0000 UTC m=+1005.809217574" Mar 20 17:33:16 crc kubenswrapper[4803]: I0320 17:33:16.886211 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b537f5f-54f2-4c70-be7b-4c57f84c572c","Type":"ContainerStarted","Data":"052d5539bc83f751f8ac5bfbf90e0df4d4248a9132e6e3ff8ad18e6caaa138b0"} Mar 20 17:33:16 crc kubenswrapper[4803]: I0320 17:33:16.893172 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l7cc6" event={"ID":"d0bcca9a-5da9-4ffc-897f-e3a0ae093324","Type":"ContainerStarted","Data":"2c28ae7e1e6e49f137063515ed5a82445edaf383db651304c9373f6075d86892"} Mar 20 17:33:16 crc kubenswrapper[4803]: I0320 17:33:16.893716 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:33:16 crc kubenswrapper[4803]: I0320 17:33:16.893759 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:33:16 crc kubenswrapper[4803]: I0320 17:33:16.944647 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-l7cc6" podStartSLOduration=9.996410545 podStartE2EDuration="19.944627134s" podCreationTimestamp="2026-03-20 17:32:57 +0000 UTC" firstStartedPulling="2026-03-20 17:33:01.590372408 +0000 UTC m=+991.501964478" lastFinishedPulling="2026-03-20 17:33:11.538588987 +0000 UTC 
m=+1001.450181067" observedRunningTime="2026-03-20 17:33:16.941318218 +0000 UTC m=+1006.852910298" watchObservedRunningTime="2026-03-20 17:33:16.944627134 +0000 UTC m=+1006.856219214" Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.908003 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ab4e6fd1-195c-4ff0-8288-14454e4ea4f1","Type":"ContainerStarted","Data":"951b24f25a2f356ee1c73af5fd3990eadd7b0233787709cae7b57366be18699d"} Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.911187 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c9af9eb-0645-4fe0-a558-6ae86595685e","Type":"ContainerStarted","Data":"33cae7c66edc6f4fbaa587850543c624b9c8542bdb420a3d7ab732a7d854a462"} Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.912700 4803 generic.go:334] "Generic (PLEG): container finished" podID="f1b71013-6f7a-4559-a3bc-af90c284cada" containerID="76271f958fa321714f27ed8196b368246d326659943dd6df698749ab43d1ae0a" exitCode=0 Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.912747 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f1b71013-6f7a-4559-a3bc-af90c284cada","Type":"ContainerDied","Data":"76271f958fa321714f27ed8196b368246d326659943dd6df698749ab43d1ae0a"} Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.915407 4803 generic.go:334] "Generic (PLEG): container finished" podID="be331baf-1bef-41ab-ac10-b8686ecb5a30" containerID="0314bf964cdd872f97d5b922059aa51a6f55ac8887870fcfb63b4ed787a9cf0f" exitCode=0 Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.915468 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"be331baf-1bef-41ab-ac10-b8686ecb5a30","Type":"ContainerDied","Data":"0314bf964cdd872f97d5b922059aa51a6f55ac8887870fcfb63b4ed787a9cf0f"} Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.938259 4803 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.91995354 podStartE2EDuration="19.93823557s" podCreationTimestamp="2026-03-20 17:32:59 +0000 UTC" firstStartedPulling="2026-03-20 17:33:02.203483955 +0000 UTC m=+992.115076025" lastFinishedPulling="2026-03-20 17:33:18.221765975 +0000 UTC m=+1008.133358055" observedRunningTime="2026-03-20 17:33:18.933912266 +0000 UTC m=+1008.845504396" watchObservedRunningTime="2026-03-20 17:33:18.93823557 +0000 UTC m=+1008.849827670" Mar 20 17:33:18 crc kubenswrapper[4803]: I0320 17:33:18.997088 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.174733466 podStartE2EDuration="21.997068885s" podCreationTimestamp="2026-03-20 17:32:57 +0000 UTC" firstStartedPulling="2026-03-20 17:33:01.415661724 +0000 UTC m=+991.327253794" lastFinishedPulling="2026-03-20 17:33:18.237997123 +0000 UTC m=+1008.149589213" observedRunningTime="2026-03-20 17:33:18.996008944 +0000 UTC m=+1008.907601014" watchObservedRunningTime="2026-03-20 17:33:18.997068885 +0000 UTC m=+1008.908660955" Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.197562 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.242805 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.632669 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.697347 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.930991 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"f1b71013-6f7a-4559-a3bc-af90c284cada","Type":"ContainerStarted","Data":"8e1310814af78e5fd660f244b8b7f82636caf21ddb1052735c39551653e14160"} Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.934680 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"be331baf-1bef-41ab-ac10-b8686ecb5a30","Type":"ContainerStarted","Data":"107bba144282520fa34f70714d88ac07fcf9d7d13eff029cf673a0213cd9236e"} Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.935846 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 17:33:19 crc kubenswrapper[4803]: I0320 17:33:19.935905 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.015921 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.016421 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.254634915 podStartE2EDuration="32.016388857s" podCreationTimestamp="2026-03-20 17:32:48 +0000 UTC" firstStartedPulling="2026-03-20 17:33:01.306994422 +0000 UTC m=+991.218586492" lastFinishedPulling="2026-03-20 17:33:14.068748314 +0000 UTC m=+1003.980340434" observedRunningTime="2026-03-20 17:33:20.004317809 +0000 UTC m=+1009.915909969" watchObservedRunningTime="2026-03-20 17:33:20.016388857 +0000 UTC m=+1009.927980957" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.017408 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.890152673 podStartE2EDuration="30.017397296s" podCreationTimestamp="2026-03-20 17:32:50 +0000 UTC" firstStartedPulling="2026-03-20 17:33:01.313820929 +0000 UTC m=+991.225412999" lastFinishedPulling="2026-03-20 
17:33:12.441065552 +0000 UTC m=+1002.352657622" observedRunningTime="2026-03-20 17:33:19.967850648 +0000 UTC m=+1009.879442798" watchObservedRunningTime="2026-03-20 17:33:20.017397296 +0000 UTC m=+1009.928989396" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.020021 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.195441 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.195513 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.379823 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gw794"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.380172 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" podUID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerName="dnsmasq-dns" containerID="cri-o://c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4" gracePeriod=10 Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.381775 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.424598 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-cjgnb"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.426606 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.428769 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.481060 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-kgbvn"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.481977 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.497811 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.512671 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-cjgnb"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.524575 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kgbvn"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.587771 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.588064 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.588202 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-config\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " 
pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.588300 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbr9b\" (UniqueName: \"kubernetes.io/projected/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-kube-api-access-wbr9b\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.588385 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.588463 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.592782 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcqfc\" (UniqueName: \"kubernetes.io/projected/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-kube-api-access-pcqfc\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.592865 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-config\") pod \"ovn-controller-metrics-kgbvn\" (UID: 
\"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.593079 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-combined-ca-bundle\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.593173 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-ovn-rundir\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.593263 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-ovs-rundir\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.607686 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.617482 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.617821 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-h79q5" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.617993 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.618349 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.626321 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.648661 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pflx7"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.648954 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" podUID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerName="dnsmasq-dns" containerID="cri-o://fe0dd3bde21dad01158d94818b90e69352faf707bed9dbc343c031fda9784ed2" gracePeriod=10 Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.651668 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.657911 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-88mf7"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.661093 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.664987 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.675230 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-88mf7"] Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.696295 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.696534 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02135c00-5c50-45c9-a206-85f9e60d9c6e-config\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.696645 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02135c00-5c50-45c9-a206-85f9e60d9c6e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.696721 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-dns-svc\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.696804 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-config\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.696875 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbr9b\" (UniqueName: \"kubernetes.io/projected/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-kube-api-access-wbr9b\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.696944 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffwgj\" (UniqueName: \"kubernetes.io/projected/afc78685-0e55-47af-9fd4-d800f0779296-kube-api-access-ffwgj\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697019 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697091 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697174 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k9lb\" (UniqueName: \"kubernetes.io/projected/02135c00-5c50-45c9-a206-85f9e60d9c6e-kube-api-access-4k9lb\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697285 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcqfc\" (UniqueName: \"kubernetes.io/projected/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-kube-api-access-pcqfc\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697361 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697434 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-config\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697502 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-combined-ca-bundle\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697593 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697678 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697757 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-ovn-rundir\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.697842 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698026 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-config\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698142 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-ovs-rundir\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698279 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698385 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02135c00-5c50-45c9-a206-85f9e60d9c6e-scripts\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698403 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-ovs-rundir\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698385 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-ovn-rundir\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698670 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-config\") pod 
\"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.698968 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.699419 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.699555 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-config\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.704603 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.705380 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-combined-ca-bundle\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " 
pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.721882 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcqfc\" (UniqueName: \"kubernetes.io/projected/7cc9f89e-5d0f-4f92-93fa-c7a0133baf05-kube-api-access-pcqfc\") pod \"ovn-controller-metrics-kgbvn\" (UID: \"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05\") " pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.727858 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbr9b\" (UniqueName: \"kubernetes.io/projected/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-kube-api-access-wbr9b\") pod \"dnsmasq-dns-6bc7876d45-cjgnb\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.746786 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.800870 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k9lb\" (UniqueName: \"kubernetes.io/projected/02135c00-5c50-45c9-a206-85f9e60d9c6e-kube-api-access-4k9lb\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801547 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801582 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801613 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801662 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801687 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-config\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801729 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02135c00-5c50-45c9-a206-85f9e60d9c6e-scripts\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801758 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " 
pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801782 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02135c00-5c50-45c9-a206-85f9e60d9c6e-config\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801807 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02135c00-5c50-45c9-a206-85f9e60d9c6e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801828 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-dns-svc\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.801865 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffwgj\" (UniqueName: \"kubernetes.io/projected/afc78685-0e55-47af-9fd4-d800f0779296-kube-api-access-ffwgj\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.806625 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.807281 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.807292 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.808298 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02135c00-5c50-45c9-a206-85f9e60d9c6e-config\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.808482 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.808624 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02135c00-5c50-45c9-a206-85f9e60d9c6e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.813566 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02135c00-5c50-45c9-a206-85f9e60d9c6e-scripts\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " 
pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.816213 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-dns-svc\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.818548 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k9lb\" (UniqueName: \"kubernetes.io/projected/02135c00-5c50-45c9-a206-85f9e60d9c6e-kube-api-access-4k9lb\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.819586 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/02135c00-5c50-45c9-a206-85f9e60d9c6e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"02135c00-5c50-45c9-a206-85f9e60d9c6e\") " pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.822698 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffwgj\" (UniqueName: \"kubernetes.io/projected/afc78685-0e55-47af-9fd4-d800f0779296-kube-api-access-ffwgj\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.823096 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-config\") pod \"dnsmasq-dns-8554648995-88mf7\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.869332 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-kgbvn" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.910160 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.946272 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.959167 4803 generic.go:334] "Generic (PLEG): container finished" podID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerID="fe0dd3bde21dad01158d94818b90e69352faf707bed9dbc343c031fda9784ed2" exitCode=0 Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.959249 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" event={"ID":"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc","Type":"ContainerDied","Data":"fe0dd3bde21dad01158d94818b90e69352faf707bed9dbc343c031fda9784ed2"} Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.961446 4803 generic.go:334] "Generic (PLEG): container finished" podID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerID="c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4" exitCode=0 Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.962229 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.963756 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" event={"ID":"f122552a-098b-4046-9a7b-82f89b7aeed7","Type":"ContainerDied","Data":"c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4"} Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.963811 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gw794" event={"ID":"f122552a-098b-4046-9a7b-82f89b7aeed7","Type":"ContainerDied","Data":"1f06f54baef0292a8b0fb2ac5463fc2c7fed72120a8a366039097e0372813d2d"} Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.963831 4803 scope.go:117] "RemoveContainer" containerID="c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4" Mar 20 17:33:20 crc kubenswrapper[4803]: I0320 17:33:20.981010 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.012704 4803 scope.go:117] "RemoveContainer" containerID="ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.032641 4803 scope.go:117] "RemoveContainer" containerID="c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4" Mar 20 17:33:21 crc kubenswrapper[4803]: E0320 17:33:21.033382 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4\": container with ID starting with c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4 not found: ID does not exist" containerID="c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.033424 4803 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4"} err="failed to get container status \"c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4\": rpc error: code = NotFound desc = could not find container \"c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4\": container with ID starting with c3255915ebacee4ba965121d290d4f4f6661e85c9205ae16cc388ca9f797ddf4 not found: ID does not exist" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.033462 4803 scope.go:117] "RemoveContainer" containerID="ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370" Mar 20 17:33:21 crc kubenswrapper[4803]: E0320 17:33:21.033722 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370\": container with ID starting with ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370 not found: ID does not exist" containerID="ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.033742 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370"} err="failed to get container status \"ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370\": rpc error: code = NotFound desc = could not find container \"ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370\": container with ID starting with ce14afadd3690741fa3e9fd1aee1126c3d1ee5c4641595aeef7258358d20f370 not found: ID does not exist" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.086997 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.110094 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-config\") pod \"f122552a-098b-4046-9a7b-82f89b7aeed7\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.110133 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k77l\" (UniqueName: \"kubernetes.io/projected/f122552a-098b-4046-9a7b-82f89b7aeed7-kube-api-access-8k77l\") pod \"f122552a-098b-4046-9a7b-82f89b7aeed7\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.110207 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-dns-svc\") pod \"f122552a-098b-4046-9a7b-82f89b7aeed7\" (UID: \"f122552a-098b-4046-9a7b-82f89b7aeed7\") " Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.114387 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f122552a-098b-4046-9a7b-82f89b7aeed7-kube-api-access-8k77l" (OuterVolumeSpecName: "kube-api-access-8k77l") pod "f122552a-098b-4046-9a7b-82f89b7aeed7" (UID: "f122552a-098b-4046-9a7b-82f89b7aeed7"). InnerVolumeSpecName "kube-api-access-8k77l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.146541 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f122552a-098b-4046-9a7b-82f89b7aeed7" (UID: "f122552a-098b-4046-9a7b-82f89b7aeed7"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.146601 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-config" (OuterVolumeSpecName: "config") pod "f122552a-098b-4046-9a7b-82f89b7aeed7" (UID: "f122552a-098b-4046-9a7b-82f89b7aeed7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.211640 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-dns-svc\") pod \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.211764 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5vpz\" (UniqueName: \"kubernetes.io/projected/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-kube-api-access-g5vpz\") pod \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.211805 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-config\") pod \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\" (UID: \"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc\") " Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.212100 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.212112 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k77l\" (UniqueName: 
\"kubernetes.io/projected/f122552a-098b-4046-9a7b-82f89b7aeed7-kube-api-access-8k77l\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.212122 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f122552a-098b-4046-9a7b-82f89b7aeed7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.216171 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-kube-api-access-g5vpz" (OuterVolumeSpecName: "kube-api-access-g5vpz") pod "733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" (UID: "733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc"). InnerVolumeSpecName "kube-api-access-g5vpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.246886 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-config" (OuterVolumeSpecName: "config") pod "733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" (UID: "733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.267875 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" (UID: "733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.298086 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gw794"] Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.304515 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gw794"] Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.313805 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5vpz\" (UniqueName: \"kubernetes.io/projected/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-kube-api-access-g5vpz\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.313841 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.313850 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.350943 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-cjgnb"] Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.439898 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-kgbvn"] Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.451007 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.522869 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-88mf7"] Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.673191 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 
17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.673465 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.900128 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.969272 4803 generic.go:334] "Generic (PLEG): container finished" podID="afc78685-0e55-47af-9fd4-d800f0779296" containerID="bba264b2d8d5f7f446aabec57859dc52c0dd1ac49814cc0246acb823e0edfebe" exitCode=0 Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.969361 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-88mf7" event={"ID":"afc78685-0e55-47af-9fd4-d800f0779296","Type":"ContainerDied","Data":"bba264b2d8d5f7f446aabec57859dc52c0dd1ac49814cc0246acb823e0edfebe"} Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.969399 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-88mf7" event={"ID":"afc78685-0e55-47af-9fd4-d800f0779296","Type":"ContainerStarted","Data":"3886f7f012633d67425ed6fb9f505f95d6f5be0b6730323335c9d85bb1d97ffa"} Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.974116 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.976571 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-pflx7" event={"ID":"733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc","Type":"ContainerDied","Data":"03906fc99af8c988f89975c2cfa182910480410662f6db466782262359ac8eb3"} Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.976624 4803 scope.go:117] "RemoveContainer" containerID="fe0dd3bde21dad01158d94818b90e69352faf707bed9dbc343c031fda9784ed2" Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.999802 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kgbvn" event={"ID":"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05","Type":"ContainerStarted","Data":"54d319c357cc8561f453de87d27c9085a52479a0e8b6dd02874ff8744e66c1c2"} Mar 20 17:33:21 crc kubenswrapper[4803]: I0320 17:33:21.999844 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-kgbvn" event={"ID":"7cc9f89e-5d0f-4f92-93fa-c7a0133baf05","Type":"ContainerStarted","Data":"aeddbc8de0b4ba1cff03515223eb0261de5f5807afe48432c8c012c6f5ef0791"} Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.004560 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02135c00-5c50-45c9-a206-85f9e60d9c6e","Type":"ContainerStarted","Data":"2ca04816149b2cf122974fb0fca089749e8c30e66e513a7380b538646b9cda3a"} Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.005728 4803 generic.go:334] "Generic (PLEG): container finished" podID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerID="8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50" exitCode=0 Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.005930 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" 
event={"ID":"1ef64a31-3daf-49cb-8f1a-688e8e6991f6","Type":"ContainerDied","Data":"8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50"} Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.005970 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" event={"ID":"1ef64a31-3daf-49cb-8f1a-688e8e6991f6","Type":"ContainerStarted","Data":"98ae1b11828d403d5a241411b299901962bc772130906240fefee90c70b5c259"} Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.027106 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-kgbvn" podStartSLOduration=2.027087376 podStartE2EDuration="2.027087376s" podCreationTimestamp="2026-03-20 17:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:22.021253278 +0000 UTC m=+1011.932845348" watchObservedRunningTime="2026-03-20 17:33:22.027087376 +0000 UTC m=+1011.938679466" Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.041026 4803 scope.go:117] "RemoveContainer" containerID="803a37cf1277419dbb96bac0c257d5eb59f7f7af3a1e1e54a4e638a129d7bc13" Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.119063 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pflx7"] Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.124998 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-pflx7"] Mar 20 17:33:22 crc kubenswrapper[4803]: E0320 17:33:22.304403 4803 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.75:58056->38.102.83.75:40209: write tcp 38.102.83.75:58056->38.102.83.75:40209: write: broken pipe Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.865800 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" 
path="/var/lib/kubelet/pods/733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc/volumes" Mar 20 17:33:22 crc kubenswrapper[4803]: I0320 17:33:22.867088 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f122552a-098b-4046-9a7b-82f89b7aeed7" path="/var/lib/kubelet/pods/f122552a-098b-4046-9a7b-82f89b7aeed7/volumes" Mar 20 17:33:23 crc kubenswrapper[4803]: I0320 17:33:23.017663 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02135c00-5c50-45c9-a206-85f9e60d9c6e","Type":"ContainerStarted","Data":"dba137b47af6af523d7c1e452de74bab94cb02c6858820e4212a9e8a6688fa8f"} Mar 20 17:33:23 crc kubenswrapper[4803]: I0320 17:33:23.019175 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" event={"ID":"1ef64a31-3daf-49cb-8f1a-688e8e6991f6","Type":"ContainerStarted","Data":"631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8"} Mar 20 17:33:23 crc kubenswrapper[4803]: I0320 17:33:23.019975 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:23 crc kubenswrapper[4803]: I0320 17:33:23.022379 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-88mf7" event={"ID":"afc78685-0e55-47af-9fd4-d800f0779296","Type":"ContainerStarted","Data":"71271357f6b7072ec5fce46d5f0262d677141ba01d0f8d3c4dad4253d184b332"} Mar 20 17:33:23 crc kubenswrapper[4803]: I0320 17:33:23.022509 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:23 crc kubenswrapper[4803]: I0320 17:33:23.038208 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" podStartSLOduration=3.038176071 podStartE2EDuration="3.038176071s" podCreationTimestamp="2026-03-20 17:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:23.035912255 +0000 UTC m=+1012.947504325" watchObservedRunningTime="2026-03-20 17:33:23.038176071 +0000 UTC m=+1012.949768171" Mar 20 17:33:23 crc kubenswrapper[4803]: I0320 17:33:23.056678 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-88mf7" podStartSLOduration=3.056658503 podStartE2EDuration="3.056658503s" podCreationTimestamp="2026-03-20 17:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:23.051613288 +0000 UTC m=+1012.963205358" watchObservedRunningTime="2026-03-20 17:33:23.056658503 +0000 UTC m=+1012.968250583" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.046794 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"02135c00-5c50-45c9-a206-85f9e60d9c6e","Type":"ContainerStarted","Data":"cb78dc8f8a4aa83810feedb071e40b6c11fcb09b1eb34819f03b616bf1694104"} Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.078067 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.788858538 podStartE2EDuration="4.078038625s" podCreationTimestamp="2026-03-20 17:33:20 +0000 UTC" firstStartedPulling="2026-03-20 17:33:21.467635576 +0000 UTC m=+1011.379227646" lastFinishedPulling="2026-03-20 17:33:22.756815663 +0000 UTC m=+1012.668407733" observedRunningTime="2026-03-20 17:33:24.070089586 +0000 UTC m=+1013.981681686" watchObservedRunningTime="2026-03-20 17:33:24.078038625 +0000 UTC m=+1013.989630705" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.169457 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.293022 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-6bc7876d45-cjgnb"] Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.308158 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.345982 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lpnmp"] Mar 20 17:33:24 crc kubenswrapper[4803]: E0320 17:33:24.346338 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerName="init" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.346354 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerName="init" Mar 20 17:33:24 crc kubenswrapper[4803]: E0320 17:33:24.346384 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerName="init" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.346393 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerName="init" Mar 20 17:33:24 crc kubenswrapper[4803]: E0320 17:33:24.346410 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerName="dnsmasq-dns" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.346416 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerName="dnsmasq-dns" Mar 20 17:33:24 crc kubenswrapper[4803]: E0320 17:33:24.346441 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerName="dnsmasq-dns" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.346449 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerName="dnsmasq-dns" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.346613 4803 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f122552a-098b-4046-9a7b-82f89b7aeed7" containerName="dnsmasq-dns" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.346636 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="733fdcd1-a9fa-4454-a2dc-c6c51c5ceebc" containerName="dnsmasq-dns" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.347435 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.356578 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.372344 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-config\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.372447 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgp7q\" (UniqueName: \"kubernetes.io/projected/87faf158-6b0e-491b-bb0e-0d9a7497290a-kube-api-access-tgp7q\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.372468 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.372495 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.372534 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.372951 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lpnmp"] Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.473955 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgp7q\" (UniqueName: \"kubernetes.io/projected/87faf158-6b0e-491b-bb0e-0d9a7497290a-kube-api-access-tgp7q\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.474002 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.474045 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " 
pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.474076 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.474128 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-config\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.474902 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.474971 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.475027 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-config\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.475106 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.493229 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgp7q\" (UniqueName: \"kubernetes.io/projected/87faf158-6b0e-491b-bb0e-0d9a7497290a-kube-api-access-tgp7q\") pod \"dnsmasq-dns-b8fbc5445-lpnmp\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:24 crc kubenswrapper[4803]: I0320 17:33:24.662928 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.054781 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 17:33:25 crc kubenswrapper[4803]: W0320 17:33:25.187000 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87faf158_6b0e_491b_bb0e_0d9a7497290a.slice/crio-6c9e6c800d8c53eaba15f25c29689660612862844be41b67ea73b8e7a6256a73 WatchSource:0}: Error finding container 6c9e6c800d8c53eaba15f25c29689660612862844be41b67ea73b8e7a6256a73: Status 404 returned error can't find the container with id 6c9e6c800d8c53eaba15f25c29689660612862844be41b67ea73b8e7a6256a73 Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.185507 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lpnmp"] Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.406556 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.412491 4803 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.417599 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.417701 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.417998 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-cm4sd" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.419714 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.443382 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.602683 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpw2z\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-kube-api-access-zpw2z\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.602749 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.602786 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " 
pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.602852 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-lock\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.602930 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-cache\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.603000 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.704338 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-lock\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.704397 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-cache\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.704458 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.704637 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpw2z\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-kube-api-access-zpw2z\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.704680 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.704721 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.705155 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: E0320 17:33:25.705514 4803 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:33:25 crc kubenswrapper[4803]: E0320 17:33:25.705557 4803 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: 
configmap "swift-ring-files" not found Mar 20 17:33:25 crc kubenswrapper[4803]: E0320 17:33:25.705606 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift podName:8d6ece75-fa3d-4695-ada9-6c5ec4b580a7 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:26.205589383 +0000 UTC m=+1016.117181453 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift") pod "swift-storage-0" (UID: "8d6ece75-fa3d-4695-ada9-6c5ec4b580a7") : configmap "swift-ring-files" not found Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.706170 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-lock\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.707035 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-cache\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.712310 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.729427 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpw2z\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-kube-api-access-zpw2z\") pod \"swift-storage-0\" (UID: 
\"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:25 crc kubenswrapper[4803]: I0320 17:33:25.731629 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.069435 4803 generic.go:334] "Generic (PLEG): container finished" podID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerID="bb401b89fe039681af9e43d77d3b549292fabea4c919713f56aa391f3bb575b0" exitCode=0 Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.069510 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" event={"ID":"87faf158-6b0e-491b-bb0e-0d9a7497290a","Type":"ContainerDied","Data":"bb401b89fe039681af9e43d77d3b549292fabea4c919713f56aa391f3bb575b0"} Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.069885 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" event={"ID":"87faf158-6b0e-491b-bb0e-0d9a7497290a","Type":"ContainerStarted","Data":"6c9e6c800d8c53eaba15f25c29689660612862844be41b67ea73b8e7a6256a73"} Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.070107 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" podUID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerName="dnsmasq-dns" containerID="cri-o://631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8" gracePeriod=10 Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.214862 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" 
Mar 20 17:33:26 crc kubenswrapper[4803]: E0320 17:33:26.215017 4803 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:33:26 crc kubenswrapper[4803]: E0320 17:33:26.215226 4803 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:33:26 crc kubenswrapper[4803]: E0320 17:33:26.215359 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift podName:8d6ece75-fa3d-4695-ada9-6c5ec4b580a7 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:27.215338881 +0000 UTC m=+1017.126931161 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift") pod "swift-storage-0" (UID: "8d6ece75-fa3d-4695-ada9-6c5ec4b580a7") : configmap "swift-ring-files" not found Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.364404 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.467832 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.572215 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.726746 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-ovsdbserver-sb\") pod \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.726794 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbr9b\" (UniqueName: \"kubernetes.io/projected/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-kube-api-access-wbr9b\") pod \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.726853 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-dns-svc\") pod \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.726952 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-config\") pod \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\" (UID: \"1ef64a31-3daf-49cb-8f1a-688e8e6991f6\") " Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.735405 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-kube-api-access-wbr9b" (OuterVolumeSpecName: "kube-api-access-wbr9b") pod "1ef64a31-3daf-49cb-8f1a-688e8e6991f6" (UID: "1ef64a31-3daf-49cb-8f1a-688e8e6991f6"). InnerVolumeSpecName "kube-api-access-wbr9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.756983 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-config" (OuterVolumeSpecName: "config") pod "1ef64a31-3daf-49cb-8f1a-688e8e6991f6" (UID: "1ef64a31-3daf-49cb-8f1a-688e8e6991f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.762172 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ef64a31-3daf-49cb-8f1a-688e8e6991f6" (UID: "1ef64a31-3daf-49cb-8f1a-688e8e6991f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.765909 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1ef64a31-3daf-49cb-8f1a-688e8e6991f6" (UID: "1ef64a31-3daf-49cb-8f1a-688e8e6991f6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.830043 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.830084 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbr9b\" (UniqueName: \"kubernetes.io/projected/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-kube-api-access-wbr9b\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.830096 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:26 crc kubenswrapper[4803]: I0320 17:33:26.830105 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef64a31-3daf-49cb-8f1a-688e8e6991f6-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.082226 4803 generic.go:334] "Generic (PLEG): container finished" podID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerID="631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8" exitCode=0 Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.082308 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" event={"ID":"1ef64a31-3daf-49cb-8f1a-688e8e6991f6","Type":"ContainerDied","Data":"631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8"} Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.082346 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.082372 4803 scope.go:117] "RemoveContainer" containerID="631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.082357 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-cjgnb" event={"ID":"1ef64a31-3daf-49cb-8f1a-688e8e6991f6","Type":"ContainerDied","Data":"98ae1b11828d403d5a241411b299901962bc772130906240fefee90c70b5c259"} Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.086841 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" event={"ID":"87faf158-6b0e-491b-bb0e-0d9a7497290a","Type":"ContainerStarted","Data":"013e259e785c8f3437ac6a9e1c3951e0053a1751848d23395a70804341d59b57"} Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.086951 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.135804 4803 scope.go:117] "RemoveContainer" containerID="8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.139198 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" podStartSLOduration=3.139183542 podStartE2EDuration="3.139183542s" podCreationTimestamp="2026-03-20 17:33:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:27.114555293 +0000 UTC m=+1017.026147413" watchObservedRunningTime="2026-03-20 17:33:27.139183542 +0000 UTC m=+1017.050775622" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.142213 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-cjgnb"] Mar 20 17:33:27 crc 
kubenswrapper[4803]: I0320 17:33:27.156622 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-cjgnb"] Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.173846 4803 scope.go:117] "RemoveContainer" containerID="631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8" Mar 20 17:33:27 crc kubenswrapper[4803]: E0320 17:33:27.174852 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8\": container with ID starting with 631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8 not found: ID does not exist" containerID="631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.174901 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8"} err="failed to get container status \"631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8\": rpc error: code = NotFound desc = could not find container \"631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8\": container with ID starting with 631532e549a08af343e988c49216cc5ca73d3ff073ccc9beddfd6b1a548326f8 not found: ID does not exist" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.174937 4803 scope.go:117] "RemoveContainer" containerID="8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50" Mar 20 17:33:27 crc kubenswrapper[4803]: E0320 17:33:27.176014 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50\": container with ID starting with 8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50 not found: ID does not exist" 
containerID="8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.176095 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50"} err="failed to get container status \"8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50\": rpc error: code = NotFound desc = could not find container \"8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50\": container with ID starting with 8e0aae3d3ea4e1338cf4ab4014f9ad168433b8a1bef802a89f18baf1efbaac50 not found: ID does not exist" Mar 20 17:33:27 crc kubenswrapper[4803]: I0320 17:33:27.237655 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:27 crc kubenswrapper[4803]: E0320 17:33:27.237999 4803 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:33:27 crc kubenswrapper[4803]: E0320 17:33:27.238097 4803 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:33:27 crc kubenswrapper[4803]: E0320 17:33:27.238225 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift podName:8d6ece75-fa3d-4695-ada9-6c5ec4b580a7 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:29.238192625 +0000 UTC m=+1019.149784755 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift") pod "swift-storage-0" (UID: "8d6ece75-fa3d-4695-ada9-6c5ec4b580a7") : configmap "swift-ring-files" not found Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.866844 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" path="/var/lib/kubelet/pods/1ef64a31-3daf-49cb-8f1a-688e8e6991f6/volumes" Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.963277 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-59vqb"] Mar 20 17:33:28 crc kubenswrapper[4803]: E0320 17:33:28.963794 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerName="dnsmasq-dns" Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.963824 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerName="dnsmasq-dns" Mar 20 17:33:28 crc kubenswrapper[4803]: E0320 17:33:28.963860 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerName="init" Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.963872 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerName="init" Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.964178 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef64a31-3daf-49cb-8f1a-688e8e6991f6" containerName="dnsmasq-dns" Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.965016 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.969558 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 17:33:28 crc kubenswrapper[4803]: I0320 17:33:28.980020 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-59vqb"] Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.085275 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8htlr\" (UniqueName: \"kubernetes.io/projected/8f3c5d60-6930-4e05-8188-dd1c237d948f-kube-api-access-8htlr\") pod \"root-account-create-update-59vqb\" (UID: \"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.085435 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f3c5d60-6930-4e05-8188-dd1c237d948f-operator-scripts\") pod \"root-account-create-update-59vqb\" (UID: \"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.187380 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f3c5d60-6930-4e05-8188-dd1c237d948f-operator-scripts\") pod \"root-account-create-update-59vqb\" (UID: \"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.187670 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8htlr\" (UniqueName: \"kubernetes.io/projected/8f3c5d60-6930-4e05-8188-dd1c237d948f-kube-api-access-8htlr\") pod \"root-account-create-update-59vqb\" (UID: 
\"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.188808 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f3c5d60-6930-4e05-8188-dd1c237d948f-operator-scripts\") pod \"root-account-create-update-59vqb\" (UID: \"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.216648 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8htlr\" (UniqueName: \"kubernetes.io/projected/8f3c5d60-6930-4e05-8188-dd1c237d948f-kube-api-access-8htlr\") pod \"root-account-create-update-59vqb\" (UID: \"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.289422 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:29 crc kubenswrapper[4803]: E0320 17:33:29.289727 4803 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:33:29 crc kubenswrapper[4803]: E0320 17:33:29.289781 4803 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:33:29 crc kubenswrapper[4803]: E0320 17:33:29.289900 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift podName:8d6ece75-fa3d-4695-ada9-6c5ec4b580a7 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:33:33.289866145 +0000 UTC m=+1023.201458265 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift") pod "swift-storage-0" (UID: "8d6ece75-fa3d-4695-ada9-6c5ec4b580a7") : configmap "swift-ring-files" not found Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.322728 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.394569 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9tqhv"] Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.400081 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.402948 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.402974 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.404814 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.427867 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9tqhv"] Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.467754 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9tqhv"] Mar 20 17:33:29 crc kubenswrapper[4803]: E0320 17:33:29.470007 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-56cbl ring-data-devices scripts swiftconf], unattached volumes=[], failed to 
process volumes=[]: context canceled" pod="openstack/swift-ring-rebalance-9tqhv" podUID="97429232-f243-4f9d-b121-8c5fc03d6807" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.477775 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2d9wv"] Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.484913 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.492579 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-ring-data-devices\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.493836 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-scripts\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.493925 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cbl\" (UniqueName: \"kubernetes.io/projected/97429232-f243-4f9d-b121-8c5fc03d6807-kube-api-access-56cbl\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.494025 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-combined-ca-bundle\") pod 
\"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.494104 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-swiftconf\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.494193 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97429232-f243-4f9d-b121-8c5fc03d6807-etc-swift\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.494285 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-dispersionconf\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.495805 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2d9wv"] Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596215 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-ring-data-devices\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596469 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-scripts\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596490 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cbl\" (UniqueName: \"kubernetes.io/projected/97429232-f243-4f9d-b121-8c5fc03d6807-kube-api-access-56cbl\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596515 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trv8v\" (UniqueName: \"kubernetes.io/projected/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-kube-api-access-trv8v\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596582 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-ring-data-devices\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596614 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-dispersionconf\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596632 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-combined-ca-bundle\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596648 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-combined-ca-bundle\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596667 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-swiftconf\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596691 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97429232-f243-4f9d-b121-8c5fc03d6807-etc-swift\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596711 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-scripts\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596728 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-etc-swift\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596755 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-dispersionconf\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.596792 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-swiftconf\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.597605 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97429232-f243-4f9d-b121-8c5fc03d6807-etc-swift\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.598306 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-ring-data-devices\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.598391 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-scripts\") pod \"swift-ring-rebalance-9tqhv\" 
(UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.602436 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-swiftconf\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.602499 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-combined-ca-bundle\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.604476 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-dispersionconf\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.617388 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cbl\" (UniqueName: \"kubernetes.io/projected/97429232-f243-4f9d-b121-8c5fc03d6807-kube-api-access-56cbl\") pod \"swift-ring-rebalance-9tqhv\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.698280 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-combined-ca-bundle\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 
20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.698383 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-scripts\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.698430 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-etc-swift\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.698500 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-swiftconf\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.698587 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trv8v\" (UniqueName: \"kubernetes.io/projected/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-kube-api-access-trv8v\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.698650 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-ring-data-devices\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.698673 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-dispersionconf\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.699462 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-etc-swift\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.700371 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-scripts\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.700678 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-ring-data-devices\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.703067 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-dispersionconf\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.706722 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-combined-ca-bundle\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.709037 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-swiftconf\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.725312 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trv8v\" (UniqueName: \"kubernetes.io/projected/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-kube-api-access-trv8v\") pod \"swift-ring-rebalance-2d9wv\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.818926 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:29 crc kubenswrapper[4803]: I0320 17:33:29.849718 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-59vqb"] Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.124094 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.125567 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-59vqb" event={"ID":"8f3c5d60-6930-4e05-8188-dd1c237d948f","Type":"ContainerStarted","Data":"dd85f501e501030ecbc8e3b2457ac08842ebe17167f2f2597a29001e781549c5"} Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.125597 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-59vqb" event={"ID":"8f3c5d60-6930-4e05-8188-dd1c237d948f","Type":"ContainerStarted","Data":"7e85caa3f307b049bfc13a61397557f886f3f31575daeb65ea9aeb964ca68a13"} Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.155540 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-59vqb" podStartSLOduration=2.155515029 podStartE2EDuration="2.155515029s" podCreationTimestamp="2026-03-20 17:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:30.153322835 +0000 UTC m=+1020.064914905" watchObservedRunningTime="2026-03-20 17:33:30.155515029 +0000 UTC m=+1020.067107099" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.156115 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.214798 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97429232-f243-4f9d-b121-8c5fc03d6807-etc-swift\") pod \"97429232-f243-4f9d-b121-8c5fc03d6807\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.214851 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-dispersionconf\") pod \"97429232-f243-4f9d-b121-8c5fc03d6807\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.214877 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-scripts\") pod \"97429232-f243-4f9d-b121-8c5fc03d6807\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.214932 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-ring-data-devices\") pod \"97429232-f243-4f9d-b121-8c5fc03d6807\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.214950 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-swiftconf\") pod \"97429232-f243-4f9d-b121-8c5fc03d6807\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.214971 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56cbl\" (UniqueName: 
\"kubernetes.io/projected/97429232-f243-4f9d-b121-8c5fc03d6807-kube-api-access-56cbl\") pod \"97429232-f243-4f9d-b121-8c5fc03d6807\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.215038 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-combined-ca-bundle\") pod \"97429232-f243-4f9d-b121-8c5fc03d6807\" (UID: \"97429232-f243-4f9d-b121-8c5fc03d6807\") " Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.216057 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97429232-f243-4f9d-b121-8c5fc03d6807-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "97429232-f243-4f9d-b121-8c5fc03d6807" (UID: "97429232-f243-4f9d-b121-8c5fc03d6807"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.222594 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "97429232-f243-4f9d-b121-8c5fc03d6807" (UID: "97429232-f243-4f9d-b121-8c5fc03d6807"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.232150 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2d9wv"] Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.233379 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-scripts" (OuterVolumeSpecName: "scripts") pod "97429232-f243-4f9d-b121-8c5fc03d6807" (UID: "97429232-f243-4f9d-b121-8c5fc03d6807"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.234623 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "97429232-f243-4f9d-b121-8c5fc03d6807" (UID: "97429232-f243-4f9d-b121-8c5fc03d6807"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.234674 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "97429232-f243-4f9d-b121-8c5fc03d6807" (UID: "97429232-f243-4f9d-b121-8c5fc03d6807"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.234699 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97429232-f243-4f9d-b121-8c5fc03d6807-kube-api-access-56cbl" (OuterVolumeSpecName: "kube-api-access-56cbl") pod "97429232-f243-4f9d-b121-8c5fc03d6807" (UID: "97429232-f243-4f9d-b121-8c5fc03d6807"). InnerVolumeSpecName "kube-api-access-56cbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.234707 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97429232-f243-4f9d-b121-8c5fc03d6807" (UID: "97429232-f243-4f9d-b121-8c5fc03d6807"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.316293 4803 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.316320 4803 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.316330 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56cbl\" (UniqueName: \"kubernetes.io/projected/97429232-f243-4f9d-b121-8c5fc03d6807-kube-api-access-56cbl\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.316340 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.316349 4803 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/97429232-f243-4f9d-b121-8c5fc03d6807-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.316356 4803 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/97429232-f243-4f9d-b121-8c5fc03d6807-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.316364 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97429232-f243-4f9d-b121-8c5fc03d6807-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:30 crc kubenswrapper[4803]: I0320 17:33:30.981917 4803 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:31 crc kubenswrapper[4803]: I0320 17:33:31.142425 4803 generic.go:334] "Generic (PLEG): container finished" podID="8f3c5d60-6930-4e05-8188-dd1c237d948f" containerID="dd85f501e501030ecbc8e3b2457ac08842ebe17167f2f2597a29001e781549c5" exitCode=0 Mar 20 17:33:31 crc kubenswrapper[4803]: I0320 17:33:31.142492 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-59vqb" event={"ID":"8f3c5d60-6930-4e05-8188-dd1c237d948f","Type":"ContainerDied","Data":"dd85f501e501030ecbc8e3b2457ac08842ebe17167f2f2597a29001e781549c5"} Mar 20 17:33:31 crc kubenswrapper[4803]: I0320 17:33:31.148094 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9tqhv" Mar 20 17:33:31 crc kubenswrapper[4803]: I0320 17:33:31.148394 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2d9wv" event={"ID":"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda","Type":"ContainerStarted","Data":"e64481ee232fca76b5c09407b0852e3d683266405a78ed5f6814314918e36b8f"} Mar 20 17:33:31 crc kubenswrapper[4803]: I0320 17:33:31.208002 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9tqhv"] Mar 20 17:33:31 crc kubenswrapper[4803]: I0320 17:33:31.233484 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9tqhv"] Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.056180 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4c46w"] Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.065279 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.071801 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4c46w"] Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.151311 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvdk\" (UniqueName: \"kubernetes.io/projected/95eec546-fa73-4572-8833-a92c4c020052-kube-api-access-wkvdk\") pod \"glance-db-create-4c46w\" (UID: \"95eec546-fa73-4572-8833-a92c4c020052\") " pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.151370 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95eec546-fa73-4572-8833-a92c4c020052-operator-scripts\") pod \"glance-db-create-4c46w\" (UID: \"95eec546-fa73-4572-8833-a92c4c020052\") " pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.172864 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3d78-account-create-update-7dwj2"] Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.174018 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.180219 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.184601 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3d78-account-create-update-7dwj2"] Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.251804 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95eec546-fa73-4572-8833-a92c4c020052-operator-scripts\") pod \"glance-db-create-4c46w\" (UID: \"95eec546-fa73-4572-8833-a92c4c020052\") " pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.251848 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srzjb\" (UniqueName: \"kubernetes.io/projected/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-kube-api-access-srzjb\") pod \"glance-3d78-account-create-update-7dwj2\" (UID: \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.251939 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-operator-scripts\") pod \"glance-3d78-account-create-update-7dwj2\" (UID: \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.251967 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvdk\" (UniqueName: \"kubernetes.io/projected/95eec546-fa73-4572-8833-a92c4c020052-kube-api-access-wkvdk\") pod \"glance-db-create-4c46w\" (UID: 
\"95eec546-fa73-4572-8833-a92c4c020052\") " pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.253236 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95eec546-fa73-4572-8833-a92c4c020052-operator-scripts\") pod \"glance-db-create-4c46w\" (UID: \"95eec546-fa73-4572-8833-a92c4c020052\") " pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.298696 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvdk\" (UniqueName: \"kubernetes.io/projected/95eec546-fa73-4572-8833-a92c4c020052-kube-api-access-wkvdk\") pod \"glance-db-create-4c46w\" (UID: \"95eec546-fa73-4572-8833-a92c4c020052\") " pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.353853 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srzjb\" (UniqueName: \"kubernetes.io/projected/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-kube-api-access-srzjb\") pod \"glance-3d78-account-create-update-7dwj2\" (UID: \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.353986 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-operator-scripts\") pod \"glance-3d78-account-create-update-7dwj2\" (UID: \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.354772 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-operator-scripts\") pod \"glance-3d78-account-create-update-7dwj2\" (UID: 
\"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.375586 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srzjb\" (UniqueName: \"kubernetes.io/projected/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-kube-api-access-srzjb\") pod \"glance-3d78-account-create-update-7dwj2\" (UID: \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.396074 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4c46w" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.498983 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.860721 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97429232-f243-4f9d-b121-8c5fc03d6807" path="/var/lib/kubelet/pods/97429232-f243-4f9d-b121-8c5fc03d6807/volumes" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.911114 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ht9gr"] Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.914123 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.925419 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ht9gr"] Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.967340 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgzt2\" (UniqueName: \"kubernetes.io/projected/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-kube-api-access-lgzt2\") pod \"keystone-db-create-ht9gr\" (UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:32 crc kubenswrapper[4803]: I0320 17:33:32.967426 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-operator-scripts\") pod \"keystone-db-create-ht9gr\" (UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.016654 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-91c6-account-create-update-pvxqz"] Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.017752 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.019271 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.029944 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-91c6-account-create-update-pvxqz"] Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.073169 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-operator-scripts\") pod \"keystone-91c6-account-create-update-pvxqz\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.073244 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgzt2\" (UniqueName: \"kubernetes.io/projected/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-kube-api-access-lgzt2\") pod \"keystone-db-create-ht9gr\" (UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.073276 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kktz7\" (UniqueName: \"kubernetes.io/projected/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-kube-api-access-kktz7\") pod \"keystone-91c6-account-create-update-pvxqz\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.073302 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-operator-scripts\") pod \"keystone-db-create-ht9gr\" 
(UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.074014 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-operator-scripts\") pod \"keystone-db-create-ht9gr\" (UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.113806 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgzt2\" (UniqueName: \"kubernetes.io/projected/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-kube-api-access-lgzt2\") pod \"keystone-db-create-ht9gr\" (UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.159449 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kxd5z"] Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.160696 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.174348 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c33e7b8-552f-4505-ab8d-40fe6c121314-operator-scripts\") pod \"placement-db-create-kxd5z\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.174416 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx7s9\" (UniqueName: \"kubernetes.io/projected/0c33e7b8-552f-4505-ab8d-40fe6c121314-kube-api-access-xx7s9\") pod \"placement-db-create-kxd5z\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.174451 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-operator-scripts\") pod \"keystone-91c6-account-create-update-pvxqz\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.174505 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kktz7\" (UniqueName: \"kubernetes.io/projected/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-kube-api-access-kktz7\") pod \"keystone-91c6-account-create-update-pvxqz\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.175336 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-operator-scripts\") pod 
\"keystone-91c6-account-create-update-pvxqz\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.186194 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kxd5z"] Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.220040 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kktz7\" (UniqueName: \"kubernetes.io/projected/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-kube-api-access-kktz7\") pod \"keystone-91c6-account-create-update-pvxqz\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.232463 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.272176 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ea93-account-create-update-k5pbr"] Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.273753 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.276191 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.276567 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx7s9\" (UniqueName: \"kubernetes.io/projected/0c33e7b8-552f-4505-ab8d-40fe6c121314-kube-api-access-xx7s9\") pod \"placement-db-create-kxd5z\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.276686 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c33e7b8-552f-4505-ab8d-40fe6c121314-operator-scripts\") pod \"placement-db-create-kxd5z\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.277382 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c33e7b8-552f-4505-ab8d-40fe6c121314-operator-scripts\") pod \"placement-db-create-kxd5z\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.280770 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ea93-account-create-update-k5pbr"] Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.300508 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx7s9\" (UniqueName: \"kubernetes.io/projected/0c33e7b8-552f-4505-ab8d-40fe6c121314-kube-api-access-xx7s9\") pod \"placement-db-create-kxd5z\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc 
kubenswrapper[4803]: I0320 17:33:33.338022 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.377390 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.377453 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64a509f5-e0df-432f-aa48-1c59aa990d09-operator-scripts\") pod \"placement-ea93-account-create-update-k5pbr\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.377528 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qxcd\" (UniqueName: \"kubernetes.io/projected/64a509f5-e0df-432f-aa48-1c59aa990d09-kube-api-access-9qxcd\") pod \"placement-ea93-account-create-update-k5pbr\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: E0320 17:33:33.377630 4803 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:33:33 crc kubenswrapper[4803]: E0320 17:33:33.377661 4803 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:33:33 crc kubenswrapper[4803]: E0320 17:33:33.379937 4803 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift podName:8d6ece75-fa3d-4695-ada9-6c5ec4b580a7 nodeName:}" failed. No retries permitted until 2026-03-20 17:33:41.379919191 +0000 UTC m=+1031.291511261 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift") pod "swift-storage-0" (UID: "8d6ece75-fa3d-4695-ada9-6c5ec4b580a7") : configmap "swift-ring-files" not found Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.479931 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64a509f5-e0df-432f-aa48-1c59aa990d09-operator-scripts\") pod \"placement-ea93-account-create-update-k5pbr\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.480037 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qxcd\" (UniqueName: \"kubernetes.io/projected/64a509f5-e0df-432f-aa48-1c59aa990d09-kube-api-access-9qxcd\") pod \"placement-ea93-account-create-update-k5pbr\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.481193 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64a509f5-e0df-432f-aa48-1c59aa990d09-operator-scripts\") pod \"placement-ea93-account-create-update-k5pbr\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.483146 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.495683 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qxcd\" (UniqueName: \"kubernetes.io/projected/64a509f5-e0df-432f-aa48-1c59aa990d09-kube-api-access-9qxcd\") pod \"placement-ea93-account-create-update-k5pbr\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.590886 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.815570 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.886241 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8htlr\" (UniqueName: \"kubernetes.io/projected/8f3c5d60-6930-4e05-8188-dd1c237d948f-kube-api-access-8htlr\") pod \"8f3c5d60-6930-4e05-8188-dd1c237d948f\" (UID: \"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.886328 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f3c5d60-6930-4e05-8188-dd1c237d948f-operator-scripts\") pod \"8f3c5d60-6930-4e05-8188-dd1c237d948f\" (UID: \"8f3c5d60-6930-4e05-8188-dd1c237d948f\") " Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.887104 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3c5d60-6930-4e05-8188-dd1c237d948f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f3c5d60-6930-4e05-8188-dd1c237d948f" (UID: "8f3c5d60-6930-4e05-8188-dd1c237d948f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.910470 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f3c5d60-6930-4e05-8188-dd1c237d948f-kube-api-access-8htlr" (OuterVolumeSpecName: "kube-api-access-8htlr") pod "8f3c5d60-6930-4e05-8188-dd1c237d948f" (UID: "8f3c5d60-6930-4e05-8188-dd1c237d948f"). InnerVolumeSpecName "kube-api-access-8htlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.987960 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f3c5d60-6930-4e05-8188-dd1c237d948f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:33 crc kubenswrapper[4803]: I0320 17:33:33.987990 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8htlr\" (UniqueName: \"kubernetes.io/projected/8f3c5d60-6930-4e05-8188-dd1c237d948f-kube-api-access-8htlr\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:34 crc kubenswrapper[4803]: I0320 17:33:34.182510 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-59vqb" event={"ID":"8f3c5d60-6930-4e05-8188-dd1c237d948f","Type":"ContainerDied","Data":"7e85caa3f307b049bfc13a61397557f886f3f31575daeb65ea9aeb964ca68a13"} Mar 20 17:33:34 crc kubenswrapper[4803]: I0320 17:33:34.182559 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e85caa3f307b049bfc13a61397557f886f3f31575daeb65ea9aeb964ca68a13" Mar 20 17:33:34 crc kubenswrapper[4803]: I0320 17:33:34.182605 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-59vqb" Mar 20 17:33:34 crc kubenswrapper[4803]: I0320 17:33:34.665728 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:33:34 crc kubenswrapper[4803]: I0320 17:33:34.726420 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-88mf7"] Mar 20 17:33:34 crc kubenswrapper[4803]: I0320 17:33:34.726688 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-88mf7" podUID="afc78685-0e55-47af-9fd4-d800f0779296" containerName="dnsmasq-dns" containerID="cri-o://71271357f6b7072ec5fce46d5f0262d677141ba01d0f8d3c4dad4253d184b332" gracePeriod=10 Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.212269 4803 generic.go:334] "Generic (PLEG): container finished" podID="afc78685-0e55-47af-9fd4-d800f0779296" containerID="71271357f6b7072ec5fce46d5f0262d677141ba01d0f8d3c4dad4253d184b332" exitCode=0 Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.212600 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-88mf7" event={"ID":"afc78685-0e55-47af-9fd4-d800f0779296","Type":"ContainerDied","Data":"71271357f6b7072ec5fce46d5f0262d677141ba01d0f8d3c4dad4253d184b332"} Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.283150 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-59vqb"] Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.289055 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-59vqb"] Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.353484 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.519281 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-dns-svc\") pod \"afc78685-0e55-47af-9fd4-d800f0779296\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.519445 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-nb\") pod \"afc78685-0e55-47af-9fd4-d800f0779296\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.519491 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffwgj\" (UniqueName: \"kubernetes.io/projected/afc78685-0e55-47af-9fd4-d800f0779296-kube-api-access-ffwgj\") pod \"afc78685-0e55-47af-9fd4-d800f0779296\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.519543 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-config\") pod \"afc78685-0e55-47af-9fd4-d800f0779296\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.519570 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-sb\") pod \"afc78685-0e55-47af-9fd4-d800f0779296\" (UID: \"afc78685-0e55-47af-9fd4-d800f0779296\") " Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.528705 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/afc78685-0e55-47af-9fd4-d800f0779296-kube-api-access-ffwgj" (OuterVolumeSpecName: "kube-api-access-ffwgj") pod "afc78685-0e55-47af-9fd4-d800f0779296" (UID: "afc78685-0e55-47af-9fd4-d800f0779296"). InnerVolumeSpecName "kube-api-access-ffwgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.556090 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ht9gr"] Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.569648 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-config" (OuterVolumeSpecName: "config") pod "afc78685-0e55-47af-9fd4-d800f0779296" (UID: "afc78685-0e55-47af-9fd4-d800f0779296"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.599019 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "afc78685-0e55-47af-9fd4-d800f0779296" (UID: "afc78685-0e55-47af-9fd4-d800f0779296"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.606160 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "afc78685-0e55-47af-9fd4-d800f0779296" (UID: "afc78685-0e55-47af-9fd4-d800f0779296"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.609579 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "afc78685-0e55-47af-9fd4-d800f0779296" (UID: "afc78685-0e55-47af-9fd4-d800f0779296"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.621982 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.622017 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.622031 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.622042 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/afc78685-0e55-47af-9fd4-d800f0779296-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.622053 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffwgj\" (UniqueName: \"kubernetes.io/projected/afc78685-0e55-47af-9fd4-d800f0779296-kube-api-access-ffwgj\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.672080 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-91c6-account-create-update-pvxqz"] 
Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.678764 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kxd5z"] Mar 20 17:33:35 crc kubenswrapper[4803]: W0320 17:33:35.679493 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba41ebed_4eb8_44f5_bcfa_7ac4f32b9afb.slice/crio-77a3fd2d48aa3b1ebab26e66ad873f3cb2ef9be98a049370db8b23a1c3ad6895 WatchSource:0}: Error finding container 77a3fd2d48aa3b1ebab26e66ad873f3cb2ef9be98a049370db8b23a1c3ad6895: Status 404 returned error can't find the container with id 77a3fd2d48aa3b1ebab26e66ad873f3cb2ef9be98a049370db8b23a1c3ad6895 Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.685567 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3d78-account-create-update-7dwj2"] Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.810541 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ea93-account-create-update-k5pbr"] Mar 20 17:33:35 crc kubenswrapper[4803]: W0320 17:33:35.814293 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64a509f5_e0df_432f_aa48_1c59aa990d09.slice/crio-7d68f02a095d5262fe4c1612e966a809ed640eef77934466633c874ca414b6b2 WatchSource:0}: Error finding container 7d68f02a095d5262fe4c1612e966a809ed640eef77934466633c874ca414b6b2: Status 404 returned error can't find the container with id 7d68f02a095d5262fe4c1612e966a809ed640eef77934466633c874ca414b6b2 Mar 20 17:33:35 crc kubenswrapper[4803]: I0320 17:33:35.872677 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4c46w"] Mar 20 17:33:35 crc kubenswrapper[4803]: W0320 17:33:35.890880 4803 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95eec546_fa73_4572_8833_a92c4c020052.slice/crio-b5cc3d85c6940af69f6c77a12c7b722f2b703c18f18e5c4d9e1a99f4b1c063e9 WatchSource:0}: Error finding container b5cc3d85c6940af69f6c77a12c7b722f2b703c18f18e5c4d9e1a99f4b1c063e9: Status 404 returned error can't find the container with id b5cc3d85c6940af69f6c77a12c7b722f2b703c18f18e5c4d9e1a99f4b1c063e9 Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.220407 4803 generic.go:334] "Generic (PLEG): container finished" podID="9ea12874-ecdb-41be-a9d0-5d5ee5e57c82" containerID="054a8d5e8e8739faae07ac7fe79d328ced4d370f2b2fef4dc90a20397c2e0daf" exitCode=0 Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.220489 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ht9gr" event={"ID":"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82","Type":"ContainerDied","Data":"054a8d5e8e8739faae07ac7fe79d328ced4d370f2b2fef4dc90a20397c2e0daf"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.220785 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ht9gr" event={"ID":"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82","Type":"ContainerStarted","Data":"f0b4d9a47b2f89c456089bbad89d49b65ee8a4ae3e183c91cd6d2cb948bb9096"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.223080 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4c46w" event={"ID":"95eec546-fa73-4572-8833-a92c4c020052","Type":"ContainerStarted","Data":"0a01bbe65e7decf9a49a95da813e2ccfce4aa71e1bb177902eaaa6e71375b94f"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.223106 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4c46w" event={"ID":"95eec546-fa73-4572-8833-a92c4c020052","Type":"ContainerStarted","Data":"b5cc3d85c6940af69f6c77a12c7b722f2b703c18f18e5c4d9e1a99f4b1c063e9"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.225349 4803 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-88mf7" event={"ID":"afc78685-0e55-47af-9fd4-d800f0779296","Type":"ContainerDied","Data":"3886f7f012633d67425ed6fb9f505f95d6f5be0b6730323335c9d85bb1d97ffa"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.225380 4803 scope.go:117] "RemoveContainer" containerID="71271357f6b7072ec5fce46d5f0262d677141ba01d0f8d3c4dad4253d184b332" Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.225515 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-88mf7" Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.230340 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2d9wv" event={"ID":"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda","Type":"ContainerStarted","Data":"0b2e060cc80b747cc960efab2733d612cef2fdf14a9d6f4d709ea11d5676040e"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.234050 4803 generic.go:334] "Generic (PLEG): container finished" podID="64a509f5-e0df-432f-aa48-1c59aa990d09" containerID="6f9fd10105318d191923863006281bacde7b8b94ad39448a0cd58552195a922d" exitCode=0 Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.234097 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ea93-account-create-update-k5pbr" event={"ID":"64a509f5-e0df-432f-aa48-1c59aa990d09","Type":"ContainerDied","Data":"6f9fd10105318d191923863006281bacde7b8b94ad39448a0cd58552195a922d"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.234114 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ea93-account-create-update-k5pbr" event={"ID":"64a509f5-e0df-432f-aa48-1c59aa990d09","Type":"ContainerStarted","Data":"7d68f02a095d5262fe4c1612e966a809ed640eef77934466633c874ca414b6b2"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.240190 4803 generic.go:334] "Generic (PLEG): container finished" podID="b5da372c-0b9a-4cbe-933a-a29f90ef2db6" 
containerID="55a8a2be8373fbfafefdb8b0b3bb484de9bba9af837f6c1e0ae3a642c48a1793" exitCode=0 Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.240239 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3d78-account-create-update-7dwj2" event={"ID":"b5da372c-0b9a-4cbe-933a-a29f90ef2db6","Type":"ContainerDied","Data":"55a8a2be8373fbfafefdb8b0b3bb484de9bba9af837f6c1e0ae3a642c48a1793"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.240256 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3d78-account-create-update-7dwj2" event={"ID":"b5da372c-0b9a-4cbe-933a-a29f90ef2db6","Type":"ContainerStarted","Data":"a3ed29e61face91ca4497c4b61bd3817ecf03b3a08fe79799204537d88685896"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.242156 4803 generic.go:334] "Generic (PLEG): container finished" podID="ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb" containerID="7c956abc5b56c59be9108d5a8dfde4852838a3032ca83a8d03d0298c2d65df92" exitCode=0 Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.242208 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-91c6-account-create-update-pvxqz" event={"ID":"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb","Type":"ContainerDied","Data":"7c956abc5b56c59be9108d5a8dfde4852838a3032ca83a8d03d0298c2d65df92"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.242231 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-91c6-account-create-update-pvxqz" event={"ID":"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb","Type":"ContainerStarted","Data":"77a3fd2d48aa3b1ebab26e66ad873f3cb2ef9be98a049370db8b23a1c3ad6895"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.243984 4803 generic.go:334] "Generic (PLEG): container finished" podID="0c33e7b8-552f-4505-ab8d-40fe6c121314" containerID="9b7f4335a75ea34ea1b5be584102f7e23bc75a644f114489b39bdea7f4a77f5f" exitCode=0 Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.244146 4803 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/placement-db-create-kxd5z" event={"ID":"0c33e7b8-552f-4505-ab8d-40fe6c121314","Type":"ContainerDied","Data":"9b7f4335a75ea34ea1b5be584102f7e23bc75a644f114489b39bdea7f4a77f5f"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.246616 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kxd5z" event={"ID":"0c33e7b8-552f-4505-ab8d-40fe6c121314","Type":"ContainerStarted","Data":"ce2814c1c8e8b9be0f6221cfc73113cde76d9f10aba830f91e7581c1bd8b1b59"} Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.253388 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-4c46w" podStartSLOduration=4.25331942 podStartE2EDuration="4.25331942s" podCreationTimestamp="2026-03-20 17:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:36.244807654 +0000 UTC m=+1026.156399734" watchObservedRunningTime="2026-03-20 17:33:36.25331942 +0000 UTC m=+1026.164911500" Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.263495 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2d9wv" podStartSLOduration=2.35702042 podStartE2EDuration="7.263481972s" podCreationTimestamp="2026-03-20 17:33:29 +0000 UTC" firstStartedPulling="2026-03-20 17:33:30.256244711 +0000 UTC m=+1020.167836781" lastFinishedPulling="2026-03-20 17:33:35.162706263 +0000 UTC m=+1025.074298333" observedRunningTime="2026-03-20 17:33:36.256171742 +0000 UTC m=+1026.167763812" watchObservedRunningTime="2026-03-20 17:33:36.263481972 +0000 UTC m=+1026.175074042" Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.407560 4803 scope.go:117] "RemoveContainer" containerID="bba264b2d8d5f7f446aabec57859dc52c0dd1ac49814cc0246acb823e0edfebe" Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.427941 4803 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-8554648995-88mf7"] Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.442581 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-88mf7"] Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.860592 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f3c5d60-6930-4e05-8188-dd1c237d948f" path="/var/lib/kubelet/pods/8f3c5d60-6930-4e05-8188-dd1c237d948f/volumes" Mar 20 17:33:36 crc kubenswrapper[4803]: I0320 17:33:36.861623 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afc78685-0e55-47af-9fd4-d800f0779296" path="/var/lib/kubelet/pods/afc78685-0e55-47af-9fd4-d800f0779296/volumes" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.268456 4803 generic.go:334] "Generic (PLEG): container finished" podID="95eec546-fa73-4572-8833-a92c4c020052" containerID="0a01bbe65e7decf9a49a95da813e2ccfce4aa71e1bb177902eaaa6e71375b94f" exitCode=0 Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.268617 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4c46w" event={"ID":"95eec546-fa73-4572-8833-a92c4c020052","Type":"ContainerDied","Data":"0a01bbe65e7decf9a49a95da813e2ccfce4aa71e1bb177902eaaa6e71375b94f"} Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.733014 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.775932 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-operator-scripts\") pod \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\" (UID: \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.776012 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srzjb\" (UniqueName: \"kubernetes.io/projected/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-kube-api-access-srzjb\") pod \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\" (UID: \"b5da372c-0b9a-4cbe-933a-a29f90ef2db6\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.776975 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b5da372c-0b9a-4cbe-933a-a29f90ef2db6" (UID: "b5da372c-0b9a-4cbe-933a-a29f90ef2db6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.799885 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-kube-api-access-srzjb" (OuterVolumeSpecName: "kube-api-access-srzjb") pod "b5da372c-0b9a-4cbe-933a-a29f90ef2db6" (UID: "b5da372c-0b9a-4cbe-933a-a29f90ef2db6"). InnerVolumeSpecName "kube-api-access-srzjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.872815 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.877208 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-operator-scripts\") pod \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.877412 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kktz7\" (UniqueName: \"kubernetes.io/projected/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-kube-api-access-kktz7\") pod \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\" (UID: \"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.877784 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb" (UID: "ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.880191 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srzjb\" (UniqueName: \"kubernetes.io/projected/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-kube-api-access-srzjb\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.883683 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.883742 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b5da372c-0b9a-4cbe-933a-a29f90ef2db6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.880714 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-kube-api-access-kktz7" (OuterVolumeSpecName: "kube-api-access-kktz7") pod "ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb" (UID: "ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb"). InnerVolumeSpecName "kube-api-access-kktz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.881108 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.921254 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.928078 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.984760 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c33e7b8-552f-4505-ab8d-40fe6c121314-operator-scripts\") pod \"0c33e7b8-552f-4505-ab8d-40fe6c121314\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.984820 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-operator-scripts\") pod \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\" (UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.984878 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qxcd\" (UniqueName: \"kubernetes.io/projected/64a509f5-e0df-432f-aa48-1c59aa990d09-kube-api-access-9qxcd\") pod \"64a509f5-e0df-432f-aa48-1c59aa990d09\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.984898 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgzt2\" (UniqueName: \"kubernetes.io/projected/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-kube-api-access-lgzt2\") pod \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\" (UID: \"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.984966 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx7s9\" (UniqueName: \"kubernetes.io/projected/0c33e7b8-552f-4505-ab8d-40fe6c121314-kube-api-access-xx7s9\") pod \"0c33e7b8-552f-4505-ab8d-40fe6c121314\" (UID: \"0c33e7b8-552f-4505-ab8d-40fe6c121314\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.985038 4803 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64a509f5-e0df-432f-aa48-1c59aa990d09-operator-scripts\") pod \"64a509f5-e0df-432f-aa48-1c59aa990d09\" (UID: \"64a509f5-e0df-432f-aa48-1c59aa990d09\") " Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.985331 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c33e7b8-552f-4505-ab8d-40fe6c121314-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c33e7b8-552f-4505-ab8d-40fe6c121314" (UID: "0c33e7b8-552f-4505-ab8d-40fe6c121314"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.985494 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ea12874-ecdb-41be-a9d0-5d5ee5e57c82" (UID: "9ea12874-ecdb-41be-a9d0-5d5ee5e57c82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.985949 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64a509f5-e0df-432f-aa48-1c59aa990d09-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64a509f5-e0df-432f-aa48-1c59aa990d09" (UID: "64a509f5-e0df-432f-aa48-1c59aa990d09"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.985952 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kktz7\" (UniqueName: \"kubernetes.io/projected/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb-kube-api-access-kktz7\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.986008 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c33e7b8-552f-4505-ab8d-40fe6c121314-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.986078 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.990813 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-kube-api-access-lgzt2" (OuterVolumeSpecName: "kube-api-access-lgzt2") pod "9ea12874-ecdb-41be-a9d0-5d5ee5e57c82" (UID: "9ea12874-ecdb-41be-a9d0-5d5ee5e57c82"). InnerVolumeSpecName "kube-api-access-lgzt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.990883 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a509f5-e0df-432f-aa48-1c59aa990d09-kube-api-access-9qxcd" (OuterVolumeSpecName: "kube-api-access-9qxcd") pod "64a509f5-e0df-432f-aa48-1c59aa990d09" (UID: "64a509f5-e0df-432f-aa48-1c59aa990d09"). InnerVolumeSpecName "kube-api-access-9qxcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:37 crc kubenswrapper[4803]: I0320 17:33:37.990905 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c33e7b8-552f-4505-ab8d-40fe6c121314-kube-api-access-xx7s9" (OuterVolumeSpecName: "kube-api-access-xx7s9") pod "0c33e7b8-552f-4505-ab8d-40fe6c121314" (UID: "0c33e7b8-552f-4505-ab8d-40fe6c121314"). InnerVolumeSpecName "kube-api-access-xx7s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.087840 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qxcd\" (UniqueName: \"kubernetes.io/projected/64a509f5-e0df-432f-aa48-1c59aa990d09-kube-api-access-9qxcd\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.087884 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgzt2\" (UniqueName: \"kubernetes.io/projected/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82-kube-api-access-lgzt2\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.087897 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx7s9\" (UniqueName: \"kubernetes.io/projected/0c33e7b8-552f-4505-ab8d-40fe6c121314-kube-api-access-xx7s9\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.087909 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64a509f5-e0df-432f-aa48-1c59aa990d09-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.291579 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ea93-account-create-update-k5pbr" event={"ID":"64a509f5-e0df-432f-aa48-1c59aa990d09","Type":"ContainerDied","Data":"7d68f02a095d5262fe4c1612e966a809ed640eef77934466633c874ca414b6b2"} Mar 20 17:33:38 crc 
kubenswrapper[4803]: I0320 17:33:38.291640 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d68f02a095d5262fe4c1612e966a809ed640eef77934466633c874ca414b6b2" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.291762 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ea93-account-create-update-k5pbr" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.294750 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3d78-account-create-update-7dwj2" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.294967 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3d78-account-create-update-7dwj2" event={"ID":"b5da372c-0b9a-4cbe-933a-a29f90ef2db6","Type":"ContainerDied","Data":"a3ed29e61face91ca4497c4b61bd3817ecf03b3a08fe79799204537d88685896"} Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.295128 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ed29e61face91ca4497c4b61bd3817ecf03b3a08fe79799204537d88685896" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.299774 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-91c6-account-create-update-pvxqz" event={"ID":"ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb","Type":"ContainerDied","Data":"77a3fd2d48aa3b1ebab26e66ad873f3cb2ef9be98a049370db8b23a1c3ad6895"} Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.299976 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77a3fd2d48aa3b1ebab26e66ad873f3cb2ef9be98a049370db8b23a1c3ad6895" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.299826 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-91c6-account-create-update-pvxqz" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.303112 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kxd5z" event={"ID":"0c33e7b8-552f-4505-ab8d-40fe6c121314","Type":"ContainerDied","Data":"ce2814c1c8e8b9be0f6221cfc73113cde76d9f10aba830f91e7581c1bd8b1b59"} Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.303200 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce2814c1c8e8b9be0f6221cfc73113cde76d9f10aba830f91e7581c1bd8b1b59" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.303313 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kxd5z" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.305489 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ht9gr" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.305752 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ht9gr" event={"ID":"9ea12874-ecdb-41be-a9d0-5d5ee5e57c82","Type":"ContainerDied","Data":"f0b4d9a47b2f89c456089bbad89d49b65ee8a4ae3e183c91cd6d2cb948bb9096"} Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.305797 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0b4d9a47b2f89c456089bbad89d49b65ee8a4ae3e183c91cd6d2cb948bb9096" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.613041 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4c46w" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.798937 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95eec546-fa73-4572-8833-a92c4c020052-operator-scripts\") pod \"95eec546-fa73-4572-8833-a92c4c020052\" (UID: \"95eec546-fa73-4572-8833-a92c4c020052\") " Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.799573 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvdk\" (UniqueName: \"kubernetes.io/projected/95eec546-fa73-4572-8833-a92c4c020052-kube-api-access-wkvdk\") pod \"95eec546-fa73-4572-8833-a92c4c020052\" (UID: \"95eec546-fa73-4572-8833-a92c4c020052\") " Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.801249 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95eec546-fa73-4572-8833-a92c4c020052-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95eec546-fa73-4572-8833-a92c4c020052" (UID: "95eec546-fa73-4572-8833-a92c4c020052"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.812851 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95eec546-fa73-4572-8833-a92c4c020052-kube-api-access-wkvdk" (OuterVolumeSpecName: "kube-api-access-wkvdk") pod "95eec546-fa73-4572-8833-a92c4c020052" (UID: "95eec546-fa73-4572-8833-a92c4c020052"). InnerVolumeSpecName "kube-api-access-wkvdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.902565 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95eec546-fa73-4572-8833-a92c4c020052-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:38 crc kubenswrapper[4803]: I0320 17:33:38.902605 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvdk\" (UniqueName: \"kubernetes.io/projected/95eec546-fa73-4572-8833-a92c4c020052-kube-api-access-wkvdk\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002211 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-97snk"] Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002619 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a509f5-e0df-432f-aa48-1c59aa990d09" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002636 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a509f5-e0df-432f-aa48-1c59aa990d09" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002648 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eec546-fa73-4572-8833-a92c4c020052" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002656 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eec546-fa73-4572-8833-a92c4c020052" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002668 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ea12874-ecdb-41be-a9d0-5d5ee5e57c82" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002673 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ea12874-ecdb-41be-a9d0-5d5ee5e57c82" 
containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002685 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3c5d60-6930-4e05-8188-dd1c237d948f" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002692 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3c5d60-6930-4e05-8188-dd1c237d948f" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002704 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002710 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002724 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5da372c-0b9a-4cbe-933a-a29f90ef2db6" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002729 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5da372c-0b9a-4cbe-933a-a29f90ef2db6" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002742 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c33e7b8-552f-4505-ab8d-40fe6c121314" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002748 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c33e7b8-552f-4505-ab8d-40fe6c121314" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002758 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc78685-0e55-47af-9fd4-d800f0779296" containerName="init" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002764 4803 
state_mem.go:107] "Deleted CPUSet assignment" podUID="afc78685-0e55-47af-9fd4-d800f0779296" containerName="init" Mar 20 17:33:39 crc kubenswrapper[4803]: E0320 17:33:39.002776 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc78685-0e55-47af-9fd4-d800f0779296" containerName="dnsmasq-dns" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002781 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc78685-0e55-47af-9fd4-d800f0779296" containerName="dnsmasq-dns" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002926 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="95eec546-fa73-4572-8833-a92c4c020052" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002938 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ea12874-ecdb-41be-a9d0-5d5ee5e57c82" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002947 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f3c5d60-6930-4e05-8188-dd1c237d948f" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002956 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc78685-0e55-47af-9fd4-d800f0779296" containerName="dnsmasq-dns" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002964 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002974 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5da372c-0b9a-4cbe-933a-a29f90ef2db6" containerName="mariadb-account-create-update" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002987 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a509f5-e0df-432f-aa48-1c59aa990d09" containerName="mariadb-account-create-update" Mar 20 
17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.002998 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c33e7b8-552f-4505-ab8d-40fe6c121314" containerName="mariadb-database-create" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.003463 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-97snk"] Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.004166 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-97snk" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.008545 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.107120 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l52r\" (UniqueName: \"kubernetes.io/projected/73088a52-2ee5-4583-9667-41d5512b5616-kube-api-access-9l52r\") pod \"root-account-create-update-97snk\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " pod="openstack/root-account-create-update-97snk" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.107199 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73088a52-2ee5-4583-9667-41d5512b5616-operator-scripts\") pod \"root-account-create-update-97snk\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " pod="openstack/root-account-create-update-97snk" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.208693 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l52r\" (UniqueName: \"kubernetes.io/projected/73088a52-2ee5-4583-9667-41d5512b5616-kube-api-access-9l52r\") pod \"root-account-create-update-97snk\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " pod="openstack/root-account-create-update-97snk" 
Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.208760 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73088a52-2ee5-4583-9667-41d5512b5616-operator-scripts\") pod \"root-account-create-update-97snk\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " pod="openstack/root-account-create-update-97snk" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.209692 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73088a52-2ee5-4583-9667-41d5512b5616-operator-scripts\") pod \"root-account-create-update-97snk\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " pod="openstack/root-account-create-update-97snk" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.236336 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l52r\" (UniqueName: \"kubernetes.io/projected/73088a52-2ee5-4583-9667-41d5512b5616-kube-api-access-9l52r\") pod \"root-account-create-update-97snk\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " pod="openstack/root-account-create-update-97snk" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.315383 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4c46w" event={"ID":"95eec546-fa73-4572-8833-a92c4c020052","Type":"ContainerDied","Data":"b5cc3d85c6940af69f6c77a12c7b722f2b703c18f18e5c4d9e1a99f4b1c063e9"} Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.315444 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5cc3d85c6940af69f6c77a12c7b722f2b703c18f18e5c4d9e1a99f4b1c063e9" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.315614 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4c46w" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.352866 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-97snk" Mar 20 17:33:39 crc kubenswrapper[4803]: I0320 17:33:39.812346 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-97snk"] Mar 20 17:33:40 crc kubenswrapper[4803]: I0320 17:33:40.325234 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-97snk" event={"ID":"73088a52-2ee5-4583-9667-41d5512b5616","Type":"ContainerStarted","Data":"3e8d9056a94901a2b63f8bc1d1d0b3b9089a2b35fb17335fd05ab99300665496"} Mar 20 17:33:41 crc kubenswrapper[4803]: I0320 17:33:41.047280 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 17:33:41 crc kubenswrapper[4803]: I0320 17:33:41.453075 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:41 crc kubenswrapper[4803]: E0320 17:33:41.453633 4803 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 17:33:41 crc kubenswrapper[4803]: E0320 17:33:41.453651 4803 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 17:33:41 crc kubenswrapper[4803]: E0320 17:33:41.453697 4803 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift podName:8d6ece75-fa3d-4695-ada9-6c5ec4b580a7 nodeName:}" failed. 
No retries permitted until 2026-03-20 17:33:57.453681269 +0000 UTC m=+1047.365273339 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift") pod "swift-storage-0" (UID: "8d6ece75-fa3d-4695-ada9-6c5ec4b580a7") : configmap "swift-ring-files" not found Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.423975 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ww57h"] Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.425225 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.427874 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.428118 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6nn79" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.450501 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ww57h"] Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.572834 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-combined-ca-bundle\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.572913 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-db-sync-config-data\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 
crc kubenswrapper[4803]: I0320 17:33:42.572968 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8rqg\" (UniqueName: \"kubernetes.io/projected/b7c27f17-4261-4a5c-830f-687a37c483fe-kube-api-access-v8rqg\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.572985 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-config-data\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.674007 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-combined-ca-bundle\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.674096 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-db-sync-config-data\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.674129 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8rqg\" (UniqueName: \"kubernetes.io/projected/b7c27f17-4261-4a5c-830f-687a37c483fe-kube-api-access-v8rqg\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.674160 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-config-data\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.679964 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-db-sync-config-data\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.680138 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-combined-ca-bundle\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.681040 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-config-data\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.703216 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8rqg\" (UniqueName: \"kubernetes.io/projected/b7c27f17-4261-4a5c-830f-687a37c483fe-kube-api-access-v8rqg\") pod \"glance-db-sync-ww57h\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:42 crc kubenswrapper[4803]: I0320 17:33:42.751498 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ww57h" Mar 20 17:33:43 crc kubenswrapper[4803]: I0320 17:33:43.324930 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ww57h"] Mar 20 17:33:43 crc kubenswrapper[4803]: W0320 17:33:43.325789 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7c27f17_4261_4a5c_830f_687a37c483fe.slice/crio-217a25426b35e6093fa98afb6fe5fd34427696a6ff86960ec4be6e1dbf35bd67 WatchSource:0}: Error finding container 217a25426b35e6093fa98afb6fe5fd34427696a6ff86960ec4be6e1dbf35bd67: Status 404 returned error can't find the container with id 217a25426b35e6093fa98afb6fe5fd34427696a6ff86960ec4be6e1dbf35bd67 Mar 20 17:33:43 crc kubenswrapper[4803]: I0320 17:33:43.347464 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-97snk" event={"ID":"73088a52-2ee5-4583-9667-41d5512b5616","Type":"ContainerStarted","Data":"7721ae2915ccc174f5844f83fbe4d12b05cabfb5654cd7e3b701902b9652f8d7"} Mar 20 17:33:43 crc kubenswrapper[4803]: I0320 17:33:43.348535 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ww57h" event={"ID":"b7c27f17-4261-4a5c-830f-687a37c483fe","Type":"ContainerStarted","Data":"217a25426b35e6093fa98afb6fe5fd34427696a6ff86960ec4be6e1dbf35bd67"} Mar 20 17:33:43 crc kubenswrapper[4803]: I0320 17:33:43.371239 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-97snk" podStartSLOduration=5.3712144219999995 podStartE2EDuration="5.371214422s" podCreationTimestamp="2026-03-20 17:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:43.359899856 +0000 UTC m=+1033.271491956" watchObservedRunningTime="2026-03-20 17:33:43.371214422 +0000 UTC m=+1033.282806522" Mar 20 17:33:44 crc 
kubenswrapper[4803]: I0320 17:33:44.362987 4803 generic.go:334] "Generic (PLEG): container finished" podID="e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" containerID="0b2e060cc80b747cc960efab2733d612cef2fdf14a9d6f4d709ea11d5676040e" exitCode=0 Mar 20 17:33:44 crc kubenswrapper[4803]: I0320 17:33:44.363080 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2d9wv" event={"ID":"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda","Type":"ContainerDied","Data":"0b2e060cc80b747cc960efab2733d612cef2fdf14a9d6f4d709ea11d5676040e"} Mar 20 17:33:44 crc kubenswrapper[4803]: I0320 17:33:44.366137 4803 generic.go:334] "Generic (PLEG): container finished" podID="73088a52-2ee5-4583-9667-41d5512b5616" containerID="7721ae2915ccc174f5844f83fbe4d12b05cabfb5654cd7e3b701902b9652f8d7" exitCode=0 Mar 20 17:33:44 crc kubenswrapper[4803]: I0320 17:33:44.366193 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-97snk" event={"ID":"73088a52-2ee5-4583-9667-41d5512b5616","Type":"ContainerDied","Data":"7721ae2915ccc174f5844f83fbe4d12b05cabfb5654cd7e3b701902b9652f8d7"} Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.676709 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.726213 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-etc-swift\") pod \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.726257 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-swiftconf\") pod \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.726271 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-dispersionconf\") pod \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.726310 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-scripts\") pod \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.727095 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" (UID: "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.737062 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" (UID: "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.750076 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-scripts" (OuterVolumeSpecName: "scripts") pod "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" (UID: "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.753941 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" (UID: "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.801472 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-97snk" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828239 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73088a52-2ee5-4583-9667-41d5512b5616-operator-scripts\") pod \"73088a52-2ee5-4583-9667-41d5512b5616\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828309 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-ring-data-devices\") pod \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828334 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-combined-ca-bundle\") pod \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828391 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trv8v\" (UniqueName: \"kubernetes.io/projected/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-kube-api-access-trv8v\") pod \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\" (UID: \"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828426 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l52r\" (UniqueName: \"kubernetes.io/projected/73088a52-2ee5-4583-9667-41d5512b5616-kube-api-access-9l52r\") pod \"73088a52-2ee5-4583-9667-41d5512b5616\" (UID: \"73088a52-2ee5-4583-9667-41d5512b5616\") " Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828710 4803 reconciler_common.go:293] "Volume detached 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828730 4803 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828743 4803 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.828754 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.829900 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" (UID: "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.830362 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73088a52-2ee5-4583-9667-41d5512b5616-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73088a52-2ee5-4583-9667-41d5512b5616" (UID: "73088a52-2ee5-4583-9667-41d5512b5616"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.833843 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-kube-api-access-trv8v" (OuterVolumeSpecName: "kube-api-access-trv8v") pod "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" (UID: "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda"). InnerVolumeSpecName "kube-api-access-trv8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.840845 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73088a52-2ee5-4583-9667-41d5512b5616-kube-api-access-9l52r" (OuterVolumeSpecName: "kube-api-access-9l52r") pod "73088a52-2ee5-4583-9667-41d5512b5616" (UID: "73088a52-2ee5-4583-9667-41d5512b5616"). InnerVolumeSpecName "kube-api-access-9l52r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.852769 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" (UID: "e846bdbc-0d6d-4dc2-9a0b-d188913b5eda"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.930407 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73088a52-2ee5-4583-9667-41d5512b5616-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.930469 4803 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.930482 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.930516 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trv8v\" (UniqueName: \"kubernetes.io/projected/e846bdbc-0d6d-4dc2-9a0b-d188913b5eda-kube-api-access-trv8v\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:45 crc kubenswrapper[4803]: I0320 17:33:45.930581 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l52r\" (UniqueName: \"kubernetes.io/projected/73088a52-2ee5-4583-9667-41d5512b5616-kube-api-access-9l52r\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:46 crc kubenswrapper[4803]: I0320 17:33:46.383105 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2d9wv" event={"ID":"e846bdbc-0d6d-4dc2-9a0b-d188913b5eda","Type":"ContainerDied","Data":"e64481ee232fca76b5c09407b0852e3d683266405a78ed5f6814314918e36b8f"} Mar 20 17:33:46 crc kubenswrapper[4803]: I0320 17:33:46.383167 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64481ee232fca76b5c09407b0852e3d683266405a78ed5f6814314918e36b8f" Mar 20 17:33:46 crc 
kubenswrapper[4803]: I0320 17:33:46.383291 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2d9wv" Mar 20 17:33:46 crc kubenswrapper[4803]: I0320 17:33:46.401465 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-97snk" event={"ID":"73088a52-2ee5-4583-9667-41d5512b5616","Type":"ContainerDied","Data":"3e8d9056a94901a2b63f8bc1d1d0b3b9089a2b35fb17335fd05ab99300665496"} Mar 20 17:33:46 crc kubenswrapper[4803]: I0320 17:33:46.401535 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e8d9056a94901a2b63f8bc1d1d0b3b9089a2b35fb17335fd05ab99300665496" Mar 20 17:33:46 crc kubenswrapper[4803]: I0320 17:33:46.401589 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-97snk" Mar 20 17:33:47 crc kubenswrapper[4803]: I0320 17:33:47.411162 4803 generic.go:334] "Generic (PLEG): container finished" podID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerID="cb59d50529a5a90c57fcd7f2353539848ddae2913f0390264bfdfe1eccbb70f0" exitCode=0 Mar 20 17:33:47 crc kubenswrapper[4803]: I0320 17:33:47.411226 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b4b88aa-f18b-40c9-b8ad-dbd3739565da","Type":"ContainerDied","Data":"cb59d50529a5a90c57fcd7f2353539848ddae2913f0390264bfdfe1eccbb70f0"} Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.013124 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-56z85" podUID="1d21c995-a420-4ac7-8cd9-c186be9e4ba0" containerName="ovn-controller" probeResult="failure" output=< Mar 20 17:33:48 crc kubenswrapper[4803]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 17:33:48 crc kubenswrapper[4803]: > Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.080181 4803 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.086078 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l7cc6" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.308382 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-56z85-config-tws5n"] Mar 20 17:33:48 crc kubenswrapper[4803]: E0320 17:33:48.308731 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" containerName="swift-ring-rebalance" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.308747 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" containerName="swift-ring-rebalance" Mar 20 17:33:48 crc kubenswrapper[4803]: E0320 17:33:48.308761 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73088a52-2ee5-4583-9667-41d5512b5616" containerName="mariadb-account-create-update" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.308768 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="73088a52-2ee5-4583-9667-41d5512b5616" containerName="mariadb-account-create-update" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.308928 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="73088a52-2ee5-4583-9667-41d5512b5616" containerName="mariadb-account-create-update" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.308939 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e846bdbc-0d6d-4dc2-9a0b-d188913b5eda" containerName="swift-ring-rebalance" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.309448 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.311748 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.320482 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56z85-config-tws5n"] Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.421321 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b4b88aa-f18b-40c9-b8ad-dbd3739565da","Type":"ContainerStarted","Data":"3a6c86e5744baf1b54c823d9ba18a6394488ed3c40a8de67d330499157e57291"} Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.421497 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.422988 4803 generic.go:334] "Generic (PLEG): container finished" podID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerID="052d5539bc83f751f8ac5bfbf90e0df4d4248a9132e6e3ff8ad18e6caaa138b0" exitCode=0 Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.423088 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b537f5f-54f2-4c70-be7b-4c57f84c572c","Type":"ContainerDied","Data":"052d5539bc83f751f8ac5bfbf90e0df4d4248a9132e6e3ff8ad18e6caaa138b0"} Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.478586 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qs59\" (UniqueName: \"kubernetes.io/projected/d5dae348-d62b-4a26-b4f4-acc447641176-kube-api-access-2qs59\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.478638 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-additional-scripts\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.478666 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-scripts\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.478702 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.478732 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-log-ovn\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.478774 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run-ovn\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc 
kubenswrapper[4803]: I0320 17:33:48.491440 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.596260834 podStartE2EDuration="1m1.491416591s" podCreationTimestamp="2026-03-20 17:32:47 +0000 UTC" firstStartedPulling="2026-03-20 17:33:01.322615812 +0000 UTC m=+991.234207882" lastFinishedPulling="2026-03-20 17:33:14.217771569 +0000 UTC m=+1004.129363639" observedRunningTime="2026-03-20 17:33:48.472235128 +0000 UTC m=+1038.383827238" watchObservedRunningTime="2026-03-20 17:33:48.491416591 +0000 UTC m=+1038.403008671" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.580097 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run-ovn\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.580372 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run-ovn\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.580564 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qs59\" (UniqueName: \"kubernetes.io/projected/d5dae348-d62b-4a26-b4f4-acc447641176-kube-api-access-2qs59\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.580591 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-additional-scripts\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.580631 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-scripts\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.580681 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.580764 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-log-ovn\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.581683 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.582077 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-log-ovn\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.582601 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-additional-scripts\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.583123 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-scripts\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.608449 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qs59\" (UniqueName: \"kubernetes.io/projected/d5dae348-d62b-4a26-b4f4-acc447641176-kube-api-access-2qs59\") pod \"ovn-controller-56z85-config-tws5n\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:48 crc kubenswrapper[4803]: I0320 17:33:48.625236 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:49 crc kubenswrapper[4803]: I0320 17:33:49.105153 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-56z85-config-tws5n"] Mar 20 17:33:49 crc kubenswrapper[4803]: W0320 17:33:49.115982 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5dae348_d62b_4a26_b4f4_acc447641176.slice/crio-a8c1eb779c9c383b4c2be05e3704c73a3911e3858671d7bd2d068b2c291bfc2f WatchSource:0}: Error finding container a8c1eb779c9c383b4c2be05e3704c73a3911e3858671d7bd2d068b2c291bfc2f: Status 404 returned error can't find the container with id a8c1eb779c9c383b4c2be05e3704c73a3911e3858671d7bd2d068b2c291bfc2f Mar 20 17:33:49 crc kubenswrapper[4803]: I0320 17:33:49.435016 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56z85-config-tws5n" event={"ID":"d5dae348-d62b-4a26-b4f4-acc447641176","Type":"ContainerStarted","Data":"3702573a176569d3e3344cfbb185455ff929addee44889ff3fb569eadffcdde5"} Mar 20 17:33:49 crc kubenswrapper[4803]: I0320 17:33:49.435057 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56z85-config-tws5n" event={"ID":"d5dae348-d62b-4a26-b4f4-acc447641176","Type":"ContainerStarted","Data":"a8c1eb779c9c383b4c2be05e3704c73a3911e3858671d7bd2d068b2c291bfc2f"} Mar 20 17:33:49 crc kubenswrapper[4803]: I0320 17:33:49.439234 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b537f5f-54f2-4c70-be7b-4c57f84c572c","Type":"ContainerStarted","Data":"26089048cbbc614440decf4c08aa5e302585d08f70067e4479dbdf5e2efb25cb"} Mar 20 17:33:49 crc kubenswrapper[4803]: I0320 17:33:49.467812 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-56z85-config-tws5n" podStartSLOduration=1.46778794 podStartE2EDuration="1.46778794s" 
podCreationTimestamp="2026-03-20 17:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:49.461094469 +0000 UTC m=+1039.372686549" watchObservedRunningTime="2026-03-20 17:33:49.46778794 +0000 UTC m=+1039.379380020" Mar 20 17:33:49 crc kubenswrapper[4803]: I0320 17:33:49.499042 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=-9223371974.355757 podStartE2EDuration="1m2.499019964s" podCreationTimestamp="2026-03-20 17:32:47 +0000 UTC" firstStartedPulling="2026-03-20 17:33:00.869452285 +0000 UTC m=+990.781044355" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:49.495664113 +0000 UTC m=+1039.407256203" watchObservedRunningTime="2026-03-20 17:33:49.499019964 +0000 UTC m=+1039.410612044" Mar 20 17:33:50 crc kubenswrapper[4803]: I0320 17:33:50.349082 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-97snk"] Mar 20 17:33:50 crc kubenswrapper[4803]: I0320 17:33:50.354123 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-97snk"] Mar 20 17:33:50 crc kubenswrapper[4803]: I0320 17:33:50.500584 4803 generic.go:334] "Generic (PLEG): container finished" podID="d5dae348-d62b-4a26-b4f4-acc447641176" containerID="3702573a176569d3e3344cfbb185455ff929addee44889ff3fb569eadffcdde5" exitCode=0 Mar 20 17:33:50 crc kubenswrapper[4803]: I0320 17:33:50.500645 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56z85-config-tws5n" event={"ID":"d5dae348-d62b-4a26-b4f4-acc447641176","Type":"ContainerDied","Data":"3702573a176569d3e3344cfbb185455ff929addee44889ff3fb569eadffcdde5"} Mar 20 17:33:50 crc kubenswrapper[4803]: I0320 17:33:50.864636 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="73088a52-2ee5-4583-9667-41d5512b5616" path="/var/lib/kubelet/pods/73088a52-2ee5-4583-9667-41d5512b5616/volumes" Mar 20 17:33:53 crc kubenswrapper[4803]: I0320 17:33:53.056042 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-56z85" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.644962 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-6j2lv"] Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.645991 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.649237 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.659683 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6j2lv"] Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.792828 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97f747a8-b136-428e-8a84-106573e537db-operator-scripts\") pod \"root-account-create-update-6j2lv\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.793175 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpcvj\" (UniqueName: \"kubernetes.io/projected/97f747a8-b136-428e-8a84-106573e537db-kube-api-access-tpcvj\") pod \"root-account-create-update-6j2lv\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.895417 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tpcvj\" (UniqueName: \"kubernetes.io/projected/97f747a8-b136-428e-8a84-106573e537db-kube-api-access-tpcvj\") pod \"root-account-create-update-6j2lv\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.895477 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97f747a8-b136-428e-8a84-106573e537db-operator-scripts\") pod \"root-account-create-update-6j2lv\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.896782 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97f747a8-b136-428e-8a84-106573e537db-operator-scripts\") pod \"root-account-create-update-6j2lv\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.921676 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpcvj\" (UniqueName: \"kubernetes.io/projected/97f747a8-b136-428e-8a84-106573e537db-kube-api-access-tpcvj\") pod \"root-account-create-update-6j2lv\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:55 crc kubenswrapper[4803]: I0320 17:33:55.971841 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.067953 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.217270 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qs59\" (UniqueName: \"kubernetes.io/projected/d5dae348-d62b-4a26-b4f4-acc447641176-kube-api-access-2qs59\") pod \"d5dae348-d62b-4a26-b4f4-acc447641176\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.217748 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-scripts\") pod \"d5dae348-d62b-4a26-b4f4-acc447641176\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.217833 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-log-ovn\") pod \"d5dae348-d62b-4a26-b4f4-acc447641176\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.217916 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run-ovn\") pod \"d5dae348-d62b-4a26-b4f4-acc447641176\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.217947 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run\") pod \"d5dae348-d62b-4a26-b4f4-acc447641176\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.217987 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-additional-scripts\") pod \"d5dae348-d62b-4a26-b4f4-acc447641176\" (UID: \"d5dae348-d62b-4a26-b4f4-acc447641176\") " Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.218303 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "d5dae348-d62b-4a26-b4f4-acc447641176" (UID: "d5dae348-d62b-4a26-b4f4-acc447641176"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.218619 4803 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.219025 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "d5dae348-d62b-4a26-b4f4-acc447641176" (UID: "d5dae348-d62b-4a26-b4f4-acc447641176"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.219079 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "d5dae348-d62b-4a26-b4f4-acc447641176" (UID: "d5dae348-d62b-4a26-b4f4-acc447641176"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.219105 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run" (OuterVolumeSpecName: "var-run") pod "d5dae348-d62b-4a26-b4f4-acc447641176" (UID: "d5dae348-d62b-4a26-b4f4-acc447641176"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.219360 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-scripts" (OuterVolumeSpecName: "scripts") pod "d5dae348-d62b-4a26-b4f4-acc447641176" (UID: "d5dae348-d62b-4a26-b4f4-acc447641176"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.221135 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5dae348-d62b-4a26-b4f4-acc447641176-kube-api-access-2qs59" (OuterVolumeSpecName: "kube-api-access-2qs59") pod "d5dae348-d62b-4a26-b4f4-acc447641176" (UID: "d5dae348-d62b-4a26-b4f4-acc447641176"). InnerVolumeSpecName "kube-api-access-2qs59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.251209 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-6j2lv"] Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.319600 4803 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.319644 4803 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d5dae348-d62b-4a26-b4f4-acc447641176-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.319652 4803 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.319662 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qs59\" (UniqueName: \"kubernetes.io/projected/d5dae348-d62b-4a26-b4f4-acc447641176-kube-api-access-2qs59\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.319672 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dae348-d62b-4a26-b4f4-acc447641176-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.526585 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.532001 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8d6ece75-fa3d-4695-ada9-6c5ec4b580a7-etc-swift\") pod \"swift-storage-0\" (UID: \"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7\") " pod="openstack/swift-storage-0" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.569932 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-56z85-config-tws5n" event={"ID":"d5dae348-d62b-4a26-b4f4-acc447641176","Type":"ContainerDied","Data":"a8c1eb779c9c383b4c2be05e3704c73a3911e3858671d7bd2d068b2c291bfc2f"} Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.569981 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8c1eb779c9c383b4c2be05e3704c73a3911e3858671d7bd2d068b2c291bfc2f" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.570056 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-56z85-config-tws5n" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.572785 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6j2lv" event={"ID":"97f747a8-b136-428e-8a84-106573e537db","Type":"ContainerStarted","Data":"94758365f09a0875ee0eca12212127bcd6701e737b8d48c3154a20b4ac15a0c8"} Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.572850 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6j2lv" event={"ID":"97f747a8-b136-428e-8a84-106573e537db","Type":"ContainerStarted","Data":"f2e172802b9e0a58faf46788b3e80142b215f78a79765bfcd6667b26694a5a37"} Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.573700 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 17:33:57 crc kubenswrapper[4803]: I0320 17:33:57.593333 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-6j2lv" podStartSLOduration=2.5933156520000002 podStartE2EDuration="2.593315652s" podCreationTimestamp="2026-03-20 17:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:33:57.591541484 +0000 UTC m=+1047.503133554" watchObservedRunningTime="2026-03-20 17:33:57.593315652 +0000 UTC m=+1047.504907722" Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.178954 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-56z85-config-tws5n"] Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.189586 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-56z85-config-tws5n"] Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.207167 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.430116 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.432199 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.432676 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.583068 4803 
generic.go:334] "Generic (PLEG): container finished" podID="97f747a8-b136-428e-8a84-106573e537db" containerID="94758365f09a0875ee0eca12212127bcd6701e737b8d48c3154a20b4ac15a0c8" exitCode=0 Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.583185 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6j2lv" event={"ID":"97f747a8-b136-428e-8a84-106573e537db","Type":"ContainerDied","Data":"94758365f09a0875ee0eca12212127bcd6701e737b8d48c3154a20b4ac15a0c8"} Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.584334 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"4fef671094ef205ebd5379612adc686da4e212e49fb0b5867fac4106897c812a"} Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.585954 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ww57h" event={"ID":"b7c27f17-4261-4a5c-830f-687a37c483fe","Type":"ContainerStarted","Data":"eb7c4c6fe2cd60e6bc64901df51792ffc3d0aeb5cbfa1dd687e92203af9cdaf2"} Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.631823 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ww57h" podStartSLOduration=2.828382657 podStartE2EDuration="16.631802436s" podCreationTimestamp="2026-03-20 17:33:42 +0000 UTC" firstStartedPulling="2026-03-20 17:33:43.327831692 +0000 UTC m=+1033.239423772" lastFinishedPulling="2026-03-20 17:33:57.131251481 +0000 UTC m=+1047.042843551" observedRunningTime="2026-03-20 17:33:58.625836164 +0000 UTC m=+1048.537428254" watchObservedRunningTime="2026-03-20 17:33:58.631802436 +0000 UTC m=+1048.543394526" Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.770809 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:33:58 crc kubenswrapper[4803]: I0320 17:33:58.859554 4803 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5dae348-d62b-4a26-b4f4-acc447641176" path="/var/lib/kubelet/pods/d5dae348-d62b-4a26-b4f4-acc447641176/volumes" Mar 20 17:33:59 crc kubenswrapper[4803]: I0320 17:33:59.825394 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6j2lv" Mar 20 17:33:59 crc kubenswrapper[4803]: I0320 17:33:59.969836 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97f747a8-b136-428e-8a84-106573e537db-operator-scripts\") pod \"97f747a8-b136-428e-8a84-106573e537db\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " Mar 20 17:33:59 crc kubenswrapper[4803]: I0320 17:33:59.970027 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpcvj\" (UniqueName: \"kubernetes.io/projected/97f747a8-b136-428e-8a84-106573e537db-kube-api-access-tpcvj\") pod \"97f747a8-b136-428e-8a84-106573e537db\" (UID: \"97f747a8-b136-428e-8a84-106573e537db\") " Mar 20 17:33:59 crc kubenswrapper[4803]: I0320 17:33:59.970498 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f747a8-b136-428e-8a84-106573e537db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97f747a8-b136-428e-8a84-106573e537db" (UID: "97f747a8-b136-428e-8a84-106573e537db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:33:59 crc kubenswrapper[4803]: I0320 17:33:59.974677 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f747a8-b136-428e-8a84-106573e537db-kube-api-access-tpcvj" (OuterVolumeSpecName: "kube-api-access-tpcvj") pod "97f747a8-b136-428e-8a84-106573e537db" (UID: "97f747a8-b136-428e-8a84-106573e537db"). InnerVolumeSpecName "kube-api-access-tpcvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.071595 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97f747a8-b136-428e-8a84-106573e537db-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.071628 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpcvj\" (UniqueName: \"kubernetes.io/projected/97f747a8-b136-428e-8a84-106573e537db-kube-api-access-tpcvj\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.135272 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567134-l24k6"] Mar 20 17:34:00 crc kubenswrapper[4803]: E0320 17:34:00.135701 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f747a8-b136-428e-8a84-106573e537db" containerName="mariadb-account-create-update" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.135717 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f747a8-b136-428e-8a84-106573e537db" containerName="mariadb-account-create-update" Mar 20 17:34:00 crc kubenswrapper[4803]: E0320 17:34:00.135753 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5dae348-d62b-4a26-b4f4-acc447641176" containerName="ovn-config" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.135763 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5dae348-d62b-4a26-b4f4-acc447641176" containerName="ovn-config" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.135938 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f747a8-b136-428e-8a84-106573e537db" containerName="mariadb-account-create-update" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.135957 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5dae348-d62b-4a26-b4f4-acc447641176" 
containerName="ovn-config" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.136581 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-l24k6" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.139144 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.140355 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.140380 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.148324 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-l24k6"] Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.274669 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z8v4\" (UniqueName: \"kubernetes.io/projected/2b7f54dd-e657-437a-8638-63c5a5cbc8c0-kube-api-access-7z8v4\") pod \"auto-csr-approver-29567134-l24k6\" (UID: \"2b7f54dd-e657-437a-8638-63c5a5cbc8c0\") " pod="openshift-infra/auto-csr-approver-29567134-l24k6" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.376033 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z8v4\" (UniqueName: \"kubernetes.io/projected/2b7f54dd-e657-437a-8638-63c5a5cbc8c0-kube-api-access-7z8v4\") pod \"auto-csr-approver-29567134-l24k6\" (UID: \"2b7f54dd-e657-437a-8638-63c5a5cbc8c0\") " pod="openshift-infra/auto-csr-approver-29567134-l24k6" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.391400 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z8v4\" (UniqueName: 
\"kubernetes.io/projected/2b7f54dd-e657-437a-8638-63c5a5cbc8c0-kube-api-access-7z8v4\") pod \"auto-csr-approver-29567134-l24k6\" (UID: \"2b7f54dd-e657-437a-8638-63c5a5cbc8c0\") " pod="openshift-infra/auto-csr-approver-29567134-l24k6" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.468997 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-l24k6" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.635794 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-6j2lv" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.635774 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-6j2lv" event={"ID":"97f747a8-b136-428e-8a84-106573e537db","Type":"ContainerDied","Data":"f2e172802b9e0a58faf46788b3e80142b215f78a79765bfcd6667b26694a5a37"} Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.636731 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e172802b9e0a58faf46788b3e80142b215f78a79765bfcd6667b26694a5a37" Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.655615 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"bceee18588b13c5e877130df0fc42bc5069c0e2f5562eb292d9f1f49d23d37ca"} Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.655665 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"416a66236e5336bca3b15857a2f29b1b74115d67dcd2eacf755b9203e57ddd40"} Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.655689 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"9fbfa15bf7d71a65394cc8c8f8b6fb5acfda082350d29788ee244d7530e645b1"} Mar 20 17:34:00 crc kubenswrapper[4803]: I0320 17:34:00.655701 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"25b8211852bbefade539cea7561f15a66c284f13c1447581ab75f0edb82b1720"} Mar 20 17:34:01 crc kubenswrapper[4803]: I0320 17:34:01.006149 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-l24k6"] Mar 20 17:34:01 crc kubenswrapper[4803]: I0320 17:34:01.667716 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-l24k6" event={"ID":"2b7f54dd-e657-437a-8638-63c5a5cbc8c0","Type":"ContainerStarted","Data":"ee9c4f20310739a7eda4e825bc98c626b6058a62691431a41356c7b656b612ed"} Mar 20 17:34:03 crc kubenswrapper[4803]: I0320 17:34:03.685352 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-l24k6" event={"ID":"2b7f54dd-e657-437a-8638-63c5a5cbc8c0","Type":"ContainerStarted","Data":"7ce7f6efeed7010297f2b259cf0e439588c084dab1c38d97c122f4f2fa496303"} Mar 20 17:34:03 crc kubenswrapper[4803]: I0320 17:34:03.695313 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"2ec0de8249565d928b674baf255ac03f27eea19eface3fef8d22f287c82287dd"} Mar 20 17:34:03 crc kubenswrapper[4803]: I0320 17:34:03.695366 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"4b934edde97d0097fe3aad08621379a790fd7dd39d8bc60a99ab1e17ff7100d4"} Mar 20 17:34:03 crc kubenswrapper[4803]: I0320 17:34:03.702812 4803 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-infra/auto-csr-approver-29567134-l24k6" podStartSLOduration=1.39369201 podStartE2EDuration="3.702798114s" podCreationTimestamp="2026-03-20 17:34:00 +0000 UTC" firstStartedPulling="2026-03-20 17:34:01.010727967 +0000 UTC m=+1050.922320077" lastFinishedPulling="2026-03-20 17:34:03.319834111 +0000 UTC m=+1053.231426181" observedRunningTime="2026-03-20 17:34:03.697230143 +0000 UTC m=+1053.608822233" watchObservedRunningTime="2026-03-20 17:34:03.702798114 +0000 UTC m=+1053.614390184" Mar 20 17:34:04 crc kubenswrapper[4803]: I0320 17:34:04.706814 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"5eb7276e24bce92fb0d2a8db834b0fb43a9b8b15171116ba63e103a51295db65"} Mar 20 17:34:04 crc kubenswrapper[4803]: I0320 17:34:04.707187 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"f2b4c98f66848e18ad6028482090dcab642932a02b4f402e19ee8322d8051f2b"} Mar 20 17:34:04 crc kubenswrapper[4803]: I0320 17:34:04.708570 4803 generic.go:334] "Generic (PLEG): container finished" podID="2b7f54dd-e657-437a-8638-63c5a5cbc8c0" containerID="7ce7f6efeed7010297f2b259cf0e439588c084dab1c38d97c122f4f2fa496303" exitCode=0 Mar 20 17:34:04 crc kubenswrapper[4803]: I0320 17:34:04.708620 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-l24k6" event={"ID":"2b7f54dd-e657-437a-8638-63c5a5cbc8c0","Type":"ContainerDied","Data":"7ce7f6efeed7010297f2b259cf0e439588c084dab1c38d97c122f4f2fa496303"} Mar 20 17:34:05 crc kubenswrapper[4803]: I0320 17:34:05.725320 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"2d3430b7e3ff9457d483efab4d6114e4e94a6b2e81f1791dbc51ec891cf964c5"} Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.110735 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-l24k6" Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.296508 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z8v4\" (UniqueName: \"kubernetes.io/projected/2b7f54dd-e657-437a-8638-63c5a5cbc8c0-kube-api-access-7z8v4\") pod \"2b7f54dd-e657-437a-8638-63c5a5cbc8c0\" (UID: \"2b7f54dd-e657-437a-8638-63c5a5cbc8c0\") " Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.300878 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b7f54dd-e657-437a-8638-63c5a5cbc8c0-kube-api-access-7z8v4" (OuterVolumeSpecName: "kube-api-access-7z8v4") pod "2b7f54dd-e657-437a-8638-63c5a5cbc8c0" (UID: "2b7f54dd-e657-437a-8638-63c5a5cbc8c0"). InnerVolumeSpecName "kube-api-access-7z8v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.403658 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z8v4\" (UniqueName: \"kubernetes.io/projected/2b7f54dd-e657-437a-8638-63c5a5cbc8c0-kube-api-access-7z8v4\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.755804 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567134-l24k6" Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.755822 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567134-l24k6" event={"ID":"2b7f54dd-e657-437a-8638-63c5a5cbc8c0","Type":"ContainerDied","Data":"ee9c4f20310739a7eda4e825bc98c626b6058a62691431a41356c7b656b612ed"} Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.758165 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee9c4f20310739a7eda4e825bc98c626b6058a62691431a41356c7b656b612ed" Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.758640 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-mstjx"] Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.764751 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"90e8ac96f1e5af4cc88a9eaead13786aa121d9e601c04432b6c2350050f39996"} Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.764791 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"b35478a63d6bff81d1cb6fa178c8a647a3af44996665aefe9903721b9bd1e924"} Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.764801 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"0d5f28c827dff73882683fa97463a3478a66ab663a21da9b6f7ef2ca02d6377e"} Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.764810 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"bab359b68c6ec11e46814aa6396a8b3f48328e2216fbd4d4bce7d6134731fa57"} Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.765843 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567128-mstjx"] Mar 20 17:34:06 crc kubenswrapper[4803]: E0320 17:34:06.853624 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b7f54dd_e657_437a_8638_63c5a5cbc8c0.slice\": RecentStats: unable to find data in memory cache]" Mar 20 17:34:06 crc kubenswrapper[4803]: I0320 17:34:06.856215 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975f31fa-3161-4c3e-aa64-e1150a9ca108" path="/var/lib/kubelet/pods/975f31fa-3161-4c3e-aa64-e1150a9ca108/volumes" Mar 20 17:34:07 crc kubenswrapper[4803]: I0320 17:34:07.786125 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"a2b35c3dc8c0d3fb1849c8547f1587f69b7ad5f42d37ddcac6fce4faae1d3297"} Mar 20 17:34:07 crc kubenswrapper[4803]: I0320 17:34:07.786739 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8d6ece75-fa3d-4695-ada9-6c5ec4b580a7","Type":"ContainerStarted","Data":"96ef89a9a73216709ba1851c2b2c4dbbdcc620cc9bb24da5d407ceb4db9a229e"} Mar 20 17:34:07 crc kubenswrapper[4803]: I0320 17:34:07.789076 4803 generic.go:334] "Generic (PLEG): container finished" podID="b7c27f17-4261-4a5c-830f-687a37c483fe" containerID="eb7c4c6fe2cd60e6bc64901df51792ffc3d0aeb5cbfa1dd687e92203af9cdaf2" exitCode=0 Mar 20 17:34:07 crc kubenswrapper[4803]: I0320 17:34:07.789145 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ww57h" 
event={"ID":"b7c27f17-4261-4a5c-830f-687a37c483fe","Type":"ContainerDied","Data":"eb7c4c6fe2cd60e6bc64901df51792ffc3d0aeb5cbfa1dd687e92203af9cdaf2"} Mar 20 17:34:07 crc kubenswrapper[4803]: I0320 17:34:07.845635 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.592138781 podStartE2EDuration="43.845613479s" podCreationTimestamp="2026-03-20 17:33:24 +0000 UTC" firstStartedPulling="2026-03-20 17:33:58.220641261 +0000 UTC m=+1048.132233341" lastFinishedPulling="2026-03-20 17:34:05.474115959 +0000 UTC m=+1055.385708039" observedRunningTime="2026-03-20 17:34:07.844505179 +0000 UTC m=+1057.756097279" watchObservedRunningTime="2026-03-20 17:34:07.845613479 +0000 UTC m=+1057.757205549" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.174016 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-tqqrz"] Mar 20 17:34:08 crc kubenswrapper[4803]: E0320 17:34:08.174352 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b7f54dd-e657-437a-8638-63c5a5cbc8c0" containerName="oc" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.174369 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b7f54dd-e657-437a-8638-63c5a5cbc8c0" containerName="oc" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.174603 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b7f54dd-e657-437a-8638-63c5a5cbc8c0" containerName="oc" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.175370 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.177644 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.198307 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-tqqrz"] Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.244037 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-config\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.244093 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.244177 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.244404 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.244586 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.244660 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvr4s\" (UniqueName: \"kubernetes.io/projected/56c26921-27aa-4a0b-a98c-a6cf7478920e-kube-api-access-hvr4s\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.346065 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvr4s\" (UniqueName: \"kubernetes.io/projected/56c26921-27aa-4a0b-a98c-a6cf7478920e-kube-api-access-hvr4s\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.346124 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-config\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.346170 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: 
\"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.346213 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.346268 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.346293 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.347098 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.347098 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-config\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 
17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.347227 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.347377 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.347705 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.367426 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvr4s\" (UniqueName: \"kubernetes.io/projected/56c26921-27aa-4a0b-a98c-a6cf7478920e-kube-api-access-hvr4s\") pod \"dnsmasq-dns-5c79d794d7-tqqrz\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.431096 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.512490 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.813693 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-9kd4s"] Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.815000 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.824255 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9kd4s"] Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.854398 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57zsb\" (UniqueName: \"kubernetes.io/projected/6ea23ab0-9514-4256-aa1b-d477da1a19fe-kube-api-access-57zsb\") pod \"cinder-db-create-9kd4s\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.854493 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ea23ab0-9514-4256-aa1b-d477da1a19fe-operator-scripts\") pod \"cinder-db-create-9kd4s\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.901252 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a62f-account-create-update-gxqzm"] Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.902181 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.908487 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.926640 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a62f-account-create-update-gxqzm"] Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.955700 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57zsb\" (UniqueName: \"kubernetes.io/projected/6ea23ab0-9514-4256-aa1b-d477da1a19fe-kube-api-access-57zsb\") pod \"cinder-db-create-9kd4s\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.955858 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ea23ab0-9514-4256-aa1b-d477da1a19fe-operator-scripts\") pod \"cinder-db-create-9kd4s\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.957166 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ea23ab0-9514-4256-aa1b-d477da1a19fe-operator-scripts\") pod \"cinder-db-create-9kd4s\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:08 crc kubenswrapper[4803]: I0320 17:34:08.978023 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57zsb\" (UniqueName: \"kubernetes.io/projected/6ea23ab0-9514-4256-aa1b-d477da1a19fe-kube-api-access-57zsb\") pod \"cinder-db-create-9kd4s\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 
17:34:09.023751 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h24x7"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.025950 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.036648 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h24x7"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.057149 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb3f389-31c1-4ada-838c-ccef884cc082-operator-scripts\") pod \"cinder-a62f-account-create-update-gxqzm\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.057559 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj72b\" (UniqueName: \"kubernetes.io/projected/afb3f389-31c1-4ada-838c-ccef884cc082-kube-api-access-lj72b\") pod \"cinder-a62f-account-create-update-gxqzm\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.070823 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-tqqrz"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.140062 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-tq5tc"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.141182 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.142008 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.159862 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj72b\" (UniqueName: \"kubernetes.io/projected/afb3f389-31c1-4ada-838c-ccef884cc082-kube-api-access-lj72b\") pod \"cinder-a62f-account-create-update-gxqzm\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.160273 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb3f389-31c1-4ada-838c-ccef884cc082-operator-scripts\") pod \"cinder-a62f-account-create-update-gxqzm\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.160395 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f19fe4f0-4b51-440d-82ad-28541b098fc4-operator-scripts\") pod \"barbican-db-create-h24x7\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.160447 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ht2\" (UniqueName: \"kubernetes.io/projected/f19fe4f0-4b51-440d-82ad-28541b098fc4-kube-api-access-47ht2\") pod \"barbican-db-create-h24x7\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.161553 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb3f389-31c1-4ada-838c-ccef884cc082-operator-scripts\") pod 
\"cinder-a62f-account-create-update-gxqzm\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.181763 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-4eea-account-create-update-lxf2n"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.183027 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.187661 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.191705 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tq5tc"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.212586 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4eea-account-create-update-lxf2n"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.215095 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj72b\" (UniqueName: \"kubernetes.io/projected/afb3f389-31c1-4ada-838c-ccef884cc082-kube-api-access-lj72b\") pod \"cinder-a62f-account-create-update-gxqzm\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.224680 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.242904 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-sc6nk"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.243867 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.249982 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.250144 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.250985 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.251156 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8lcw" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.264748 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1d16ab-bd85-4b4b-8862-19f134432523-operator-scripts\") pod \"barbican-4eea-account-create-update-lxf2n\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.265243 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b50998-7271-4d76-bf90-7a23ab8ae295-operator-scripts\") pod \"neutron-db-create-tq5tc\" (UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.265353 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv8bb\" (UniqueName: \"kubernetes.io/projected/d9b50998-7271-4d76-bf90-7a23ab8ae295-kube-api-access-qv8bb\") pod \"neutron-db-create-tq5tc\" (UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:09 crc 
kubenswrapper[4803]: I0320 17:34:09.265450 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-combined-ca-bundle\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.265582 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7plh2\" (UniqueName: \"kubernetes.io/projected/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-kube-api-access-7plh2\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.265683 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8srg\" (UniqueName: \"kubernetes.io/projected/bf1d16ab-bd85-4b4b-8862-19f134432523-kube-api-access-n8srg\") pod \"barbican-4eea-account-create-update-lxf2n\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.265863 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f19fe4f0-4b51-440d-82ad-28541b098fc4-operator-scripts\") pod \"barbican-db-create-h24x7\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.265897 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-config-data\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " 
pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.265922 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ht2\" (UniqueName: \"kubernetes.io/projected/f19fe4f0-4b51-440d-82ad-28541b098fc4-kube-api-access-47ht2\") pod \"barbican-db-create-h24x7\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.267187 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f19fe4f0-4b51-440d-82ad-28541b098fc4-operator-scripts\") pod \"barbican-db-create-h24x7\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.273079 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-sc6nk"] Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.294588 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ht2\" (UniqueName: \"kubernetes.io/projected/f19fe4f0-4b51-440d-82ad-28541b098fc4-kube-api-access-47ht2\") pod \"barbican-db-create-h24x7\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.349378 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.367610 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-config-data\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.367687 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1d16ab-bd85-4b4b-8862-19f134432523-operator-scripts\") pod \"barbican-4eea-account-create-update-lxf2n\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.367718 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b50998-7271-4d76-bf90-7a23ab8ae295-operator-scripts\") pod \"neutron-db-create-tq5tc\" (UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.367752 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv8bb\" (UniqueName: \"kubernetes.io/projected/d9b50998-7271-4d76-bf90-7a23ab8ae295-kube-api-access-qv8bb\") pod \"neutron-db-create-tq5tc\" (UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.367778 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-combined-ca-bundle\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " 
pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.367799 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7plh2\" (UniqueName: \"kubernetes.io/projected/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-kube-api-access-7plh2\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.367818 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8srg\" (UniqueName: \"kubernetes.io/projected/bf1d16ab-bd85-4b4b-8862-19f134432523-kube-api-access-n8srg\") pod \"barbican-4eea-account-create-update-lxf2n\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.370142 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1d16ab-bd85-4b4b-8862-19f134432523-operator-scripts\") pod \"barbican-4eea-account-create-update-lxf2n\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.370644 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b50998-7271-4d76-bf90-7a23ab8ae295-operator-scripts\") pod \"neutron-db-create-tq5tc\" (UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.374218 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-combined-ca-bundle\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " 
pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.377898 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-config-data\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.387300 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8srg\" (UniqueName: \"kubernetes.io/projected/bf1d16ab-bd85-4b4b-8862-19f134432523-kube-api-access-n8srg\") pod \"barbican-4eea-account-create-update-lxf2n\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:09 crc kubenswrapper[4803]: I0320 17:34:09.391184 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7plh2\" (UniqueName: \"kubernetes.io/projected/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-kube-api-access-7plh2\") pod \"keystone-db-sync-sc6nk\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.404796 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv8bb\" (UniqueName: \"kubernetes.io/projected/d9b50998-7271-4d76-bf90-7a23ab8ae295-kube-api-access-qv8bb\") pod \"neutron-db-create-tq5tc\" (UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.424778 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e380-account-create-update-9mlv9"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.430440 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.433949 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.448753 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e380-account-create-update-9mlv9"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.460516 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ww57h" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.501889 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.558010 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.574395 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8rqg\" (UniqueName: \"kubernetes.io/projected/b7c27f17-4261-4a5c-830f-687a37c483fe-kube-api-access-v8rqg\") pod \"b7c27f17-4261-4a5c-830f-687a37c483fe\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.574474 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-db-sync-config-data\") pod \"b7c27f17-4261-4a5c-830f-687a37c483fe\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.574573 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-config-data\") pod 
\"b7c27f17-4261-4a5c-830f-687a37c483fe\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.574615 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-combined-ca-bundle\") pod \"b7c27f17-4261-4a5c-830f-687a37c483fe\" (UID: \"b7c27f17-4261-4a5c-830f-687a37c483fe\") " Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.574988 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mg9q\" (UniqueName: \"kubernetes.io/projected/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-kube-api-access-8mg9q\") pod \"neutron-e380-account-create-update-9mlv9\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.575026 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-operator-scripts\") pod \"neutron-e380-account-create-update-9mlv9\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.579115 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7c27f17-4261-4a5c-830f-687a37c483fe-kube-api-access-v8rqg" (OuterVolumeSpecName: "kube-api-access-v8rqg") pod "b7c27f17-4261-4a5c-830f-687a37c483fe" (UID: "b7c27f17-4261-4a5c-830f-687a37c483fe"). InnerVolumeSpecName "kube-api-access-v8rqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.580204 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b7c27f17-4261-4a5c-830f-687a37c483fe" (UID: "b7c27f17-4261-4a5c-830f-687a37c483fe"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.607500 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.628042 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7c27f17-4261-4a5c-830f-687a37c483fe" (UID: "b7c27f17-4261-4a5c-830f-687a37c483fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.651944 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-config-data" (OuterVolumeSpecName: "config-data") pod "b7c27f17-4261-4a5c-830f-687a37c483fe" (UID: "b7c27f17-4261-4a5c-830f-687a37c483fe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.677286 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-operator-scripts\") pod \"neutron-e380-account-create-update-9mlv9\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.677429 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mg9q\" (UniqueName: \"kubernetes.io/projected/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-kube-api-access-8mg9q\") pod \"neutron-e380-account-create-update-9mlv9\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.677473 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8rqg\" (UniqueName: \"kubernetes.io/projected/b7c27f17-4261-4a5c-830f-687a37c483fe-kube-api-access-v8rqg\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.677485 4803 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.677494 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.677503 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7c27f17-4261-4a5c-830f-687a37c483fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 
17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.678373 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-operator-scripts\") pod \"neutron-e380-account-create-update-9mlv9\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.699869 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mg9q\" (UniqueName: \"kubernetes.io/projected/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-kube-api-access-8mg9q\") pod \"neutron-e380-account-create-update-9mlv9\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.784016 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.807964 4803 generic.go:334] "Generic (PLEG): container finished" podID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerID="6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c" exitCode=0 Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.808030 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" event={"ID":"56c26921-27aa-4a0b-a98c-a6cf7478920e","Type":"ContainerDied","Data":"6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.808167 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" event={"ID":"56c26921-27aa-4a0b-a98c-a6cf7478920e","Type":"ContainerStarted","Data":"abf1d1d51c2e0eb750e57b30656f2d84970863819043274758db045879d14733"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.811868 4803 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-db-sync-ww57h" event={"ID":"b7c27f17-4261-4a5c-830f-687a37c483fe","Type":"ContainerDied","Data":"217a25426b35e6093fa98afb6fe5fd34427696a6ff86960ec4be6e1dbf35bd67"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.811887 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="217a25426b35e6093fa98afb6fe5fd34427696a6ff86960ec4be6e1dbf35bd67" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:09.811932 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ww57h" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.241291 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-tqqrz"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.276523 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wgnp4"] Mar 20 17:34:10 crc kubenswrapper[4803]: E0320 17:34:10.276830 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7c27f17-4261-4a5c-830f-687a37c483fe" containerName="glance-db-sync" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.276846 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7c27f17-4261-4a5c-830f-687a37c483fe" containerName="glance-db-sync" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.277003 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7c27f17-4261-4a5c-830f-687a37c483fe" containerName="glance-db-sync" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.277775 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.288791 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wgnp4"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.326923 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9kd4s"] Mar 20 17:34:10 crc kubenswrapper[4803]: W0320 17:34:10.333238 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea23ab0_9514_4256_aa1b_d477da1a19fe.slice/crio-67cc3a27afbeca9e26460275711efdb110eb338258fbaa047c9c40686ab1639b WatchSource:0}: Error finding container 67cc3a27afbeca9e26460275711efdb110eb338258fbaa047c9c40686ab1639b: Status 404 returned error can't find the container with id 67cc3a27afbeca9e26460275711efdb110eb338258fbaa047c9c40686ab1639b Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.348419 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a62f-account-create-update-gxqzm"] Mar 20 17:34:10 crc kubenswrapper[4803]: W0320 17:34:10.357062 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb3f389_31c1_4ada_838c_ccef884cc082.slice/crio-4b617029315685824510bd6b26475a35037d8d90693ac99b87d56f3ed15cd88f WatchSource:0}: Error finding container 4b617029315685824510bd6b26475a35037d8d90693ac99b87d56f3ed15cd88f: Status 404 returned error can't find the container with id 4b617029315685824510bd6b26475a35037d8d90693ac99b87d56f3ed15cd88f Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.388575 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdkml\" (UniqueName: \"kubernetes.io/projected/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-kube-api-access-xdkml\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: 
\"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.388657 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.388695 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.388732 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.388796 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-config\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.388843 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-swift-storage-0\") pod 
\"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.430479 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-4eea-account-create-update-lxf2n"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.482095 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-sc6nk"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.490185 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.490322 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdkml\" (UniqueName: \"kubernetes.io/projected/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-kube-api-access-xdkml\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.490400 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.490437 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.490495 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.490565 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-config\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.491328 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.491553 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-config\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.491590 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 
17:34:10.491858 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h24x7"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.491916 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.492080 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.510102 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdkml\" (UniqueName: \"kubernetes.io/projected/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-kube-api-access-xdkml\") pod \"dnsmasq-dns-5f59b8f679-wgnp4\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.514200 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e380-account-create-update-9mlv9"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.523597 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-tq5tc"] Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.597433 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.819567 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tq5tc" event={"ID":"d9b50998-7271-4d76-bf90-7a23ab8ae295","Type":"ContainerStarted","Data":"711cc039f843e77241791e6addc80220f37d1a0754d82616a887b84b66939943"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.820952 4803 generic.go:334] "Generic (PLEG): container finished" podID="6ea23ab0-9514-4256-aa1b-d477da1a19fe" containerID="2ed94c68c07d9b2f49e25944f7987b5350638170c181ca2258d9ccc97cf4ca3e" exitCode=0 Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.821294 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9kd4s" event={"ID":"6ea23ab0-9514-4256-aa1b-d477da1a19fe","Type":"ContainerDied","Data":"2ed94c68c07d9b2f49e25944f7987b5350638170c181ca2258d9ccc97cf4ca3e"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.821323 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9kd4s" event={"ID":"6ea23ab0-9514-4256-aa1b-d477da1a19fe","Type":"ContainerStarted","Data":"67cc3a27afbeca9e26460275711efdb110eb338258fbaa047c9c40686ab1639b"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.826039 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e380-account-create-update-9mlv9" event={"ID":"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc","Type":"ContainerStarted","Data":"9db5363bd6e9c912583788be4196d29f56894bfed0c8937d099a2322778e753d"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.828098 4803 generic.go:334] "Generic (PLEG): container finished" podID="afb3f389-31c1-4ada-838c-ccef884cc082" containerID="346af28290c06d13505c4d476f70328afa1c19d4692c00d605035136c8f87c9b" exitCode=0 Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.828134 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-a62f-account-create-update-gxqzm" event={"ID":"afb3f389-31c1-4ada-838c-ccef884cc082","Type":"ContainerDied","Data":"346af28290c06d13505c4d476f70328afa1c19d4692c00d605035136c8f87c9b"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.828162 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a62f-account-create-update-gxqzm" event={"ID":"afb3f389-31c1-4ada-838c-ccef884cc082","Type":"ContainerStarted","Data":"4b617029315685824510bd6b26475a35037d8d90693ac99b87d56f3ed15cd88f"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.832633 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4eea-account-create-update-lxf2n" event={"ID":"bf1d16ab-bd85-4b4b-8862-19f134432523","Type":"ContainerStarted","Data":"9193544e601958e179a92e801e9e87a8b4933b6fdee942b4555283766e602a05"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.839300 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sc6nk" event={"ID":"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2","Type":"ContainerStarted","Data":"dcf9755f34949fae1c1bd7fd169dfa11ec5caa41b1da16c8ebd0413d81188d96"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.843485 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" event={"ID":"56c26921-27aa-4a0b-a98c-a6cf7478920e","Type":"ContainerStarted","Data":"3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14"} Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.843679 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:10 crc kubenswrapper[4803]: I0320 17:34:10.845150 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h24x7" event={"ID":"f19fe4f0-4b51-440d-82ad-28541b098fc4","Type":"ContainerStarted","Data":"e296ec6b9eb50f0f62b3998f21aff1fb38ef791836f65e3c2fabbc4943a11582"} Mar 20 17:34:10 crc 
kubenswrapper[4803]: I0320 17:34:10.964709 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" podStartSLOduration=2.964690929 podStartE2EDuration="2.964690929s" podCreationTimestamp="2026-03-20 17:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:10.960155786 +0000 UTC m=+1060.871747876" watchObservedRunningTime="2026-03-20 17:34:10.964690929 +0000 UTC m=+1060.876282999" Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.057240 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wgnp4"] Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.857597 4803 generic.go:334] "Generic (PLEG): container finished" podID="9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc" containerID="ab2bef885a472f8999e3ab99d73dad338bfd52e3c51613776827341946719d14" exitCode=0 Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.857679 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e380-account-create-update-9mlv9" event={"ID":"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc","Type":"ContainerDied","Data":"ab2bef885a472f8999e3ab99d73dad338bfd52e3c51613776827341946719d14"} Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.859580 4803 generic.go:334] "Generic (PLEG): container finished" podID="bf1d16ab-bd85-4b4b-8862-19f134432523" containerID="77674519c2307d61508c3fcc6ae48715c4d07ed315245c2b437b24b6d24d20e4" exitCode=0 Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.859639 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4eea-account-create-update-lxf2n" event={"ID":"bf1d16ab-bd85-4b4b-8862-19f134432523","Type":"ContainerDied","Data":"77674519c2307d61508c3fcc6ae48715c4d07ed315245c2b437b24b6d24d20e4"} Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.864371 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerID="86006e5fe143a864b511d8aa7f54dd3bd571cffd52237d95957d27f821034591" exitCode=0 Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.864471 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" event={"ID":"b6450f12-4337-43e5-b4e8-817e6b9a8d8f","Type":"ContainerDied","Data":"86006e5fe143a864b511d8aa7f54dd3bd571cffd52237d95957d27f821034591"} Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.864502 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" event={"ID":"b6450f12-4337-43e5-b4e8-817e6b9a8d8f","Type":"ContainerStarted","Data":"1a69aa12ee69141d06d700f260cc940ca9dba6792d61d0eae8c70d054661a5e6"} Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.867743 4803 generic.go:334] "Generic (PLEG): container finished" podID="f19fe4f0-4b51-440d-82ad-28541b098fc4" containerID="45b0dbf2ad3242b1295eaee87dce821ddb72767b65fa42e6566430ee2ec75449" exitCode=0 Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.867834 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h24x7" event={"ID":"f19fe4f0-4b51-440d-82ad-28541b098fc4","Type":"ContainerDied","Data":"45b0dbf2ad3242b1295eaee87dce821ddb72767b65fa42e6566430ee2ec75449"} Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.876993 4803 generic.go:334] "Generic (PLEG): container finished" podID="d9b50998-7271-4d76-bf90-7a23ab8ae295" containerID="1950163278c61edad4a406e974e57dc2848599bf8e0a7c395668f62546453be9" exitCode=0 Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 17:34:11.877329 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" podUID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerName="dnsmasq-dns" containerID="cri-o://3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14" gracePeriod=10 Mar 20 17:34:11 crc kubenswrapper[4803]: I0320 
17:34:11.877839 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tq5tc" event={"ID":"d9b50998-7271-4d76-bf90-7a23ab8ae295","Type":"ContainerDied","Data":"1950163278c61edad4a406e974e57dc2848599bf8e0a7c395668f62546453be9"} Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.331840 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.386780 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.445005 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ea23ab0-9514-4256-aa1b-d477da1a19fe-operator-scripts\") pod \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.445150 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57zsb\" (UniqueName: \"kubernetes.io/projected/6ea23ab0-9514-4256-aa1b-d477da1a19fe-kube-api-access-57zsb\") pod \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\" (UID: \"6ea23ab0-9514-4256-aa1b-d477da1a19fe\") " Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.445736 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea23ab0-9514-4256-aa1b-d477da1a19fe-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ea23ab0-9514-4256-aa1b-d477da1a19fe" (UID: "6ea23ab0-9514-4256-aa1b-d477da1a19fe"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.453087 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea23ab0-9514-4256-aa1b-d477da1a19fe-kube-api-access-57zsb" (OuterVolumeSpecName: "kube-api-access-57zsb") pod "6ea23ab0-9514-4256-aa1b-d477da1a19fe" (UID: "6ea23ab0-9514-4256-aa1b-d477da1a19fe"). InnerVolumeSpecName "kube-api-access-57zsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.558210 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj72b\" (UniqueName: \"kubernetes.io/projected/afb3f389-31c1-4ada-838c-ccef884cc082-kube-api-access-lj72b\") pod \"afb3f389-31c1-4ada-838c-ccef884cc082\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.558338 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb3f389-31c1-4ada-838c-ccef884cc082-operator-scripts\") pod \"afb3f389-31c1-4ada-838c-ccef884cc082\" (UID: \"afb3f389-31c1-4ada-838c-ccef884cc082\") " Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.558765 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ea23ab0-9514-4256-aa1b-d477da1a19fe-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.558779 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57zsb\" (UniqueName: \"kubernetes.io/projected/6ea23ab0-9514-4256-aa1b-d477da1a19fe-kube-api-access-57zsb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.559782 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/afb3f389-31c1-4ada-838c-ccef884cc082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afb3f389-31c1-4ada-838c-ccef884cc082" (UID: "afb3f389-31c1-4ada-838c-ccef884cc082"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.561180 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb3f389-31c1-4ada-838c-ccef884cc082-kube-api-access-lj72b" (OuterVolumeSpecName: "kube-api-access-lj72b") pod "afb3f389-31c1-4ada-838c-ccef884cc082" (UID: "afb3f389-31c1-4ada-838c-ccef884cc082"). InnerVolumeSpecName "kube-api-access-lj72b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.660005 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj72b\" (UniqueName: \"kubernetes.io/projected/afb3f389-31c1-4ada-838c-ccef884cc082-kube-api-access-lj72b\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.660043 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afb3f389-31c1-4ada-838c-ccef884cc082-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.863505 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.898153 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" event={"ID":"b6450f12-4337-43e5-b4e8-817e6b9a8d8f","Type":"ContainerStarted","Data":"ba659fc0cdd28b23c62059f98ea06997fdfba92de1c809f239614a6726b8efd7"} Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.898283 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.899873 4803 generic.go:334] "Generic (PLEG): container finished" podID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerID="3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14" exitCode=0 Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.899933 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" event={"ID":"56c26921-27aa-4a0b-a98c-a6cf7478920e","Type":"ContainerDied","Data":"3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14"} Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.899959 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" event={"ID":"56c26921-27aa-4a0b-a98c-a6cf7478920e","Type":"ContainerDied","Data":"abf1d1d51c2e0eb750e57b30656f2d84970863819043274758db045879d14733"} Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.899975 4803 scope.go:117] "RemoveContainer" containerID="3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.899976 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-tqqrz" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.901118 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9kd4s" event={"ID":"6ea23ab0-9514-4256-aa1b-d477da1a19fe","Type":"ContainerDied","Data":"67cc3a27afbeca9e26460275711efdb110eb338258fbaa047c9c40686ab1639b"} Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.901137 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67cc3a27afbeca9e26460275711efdb110eb338258fbaa047c9c40686ab1639b" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.901179 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9kd4s" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.903649 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a62f-account-create-update-gxqzm" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.904040 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a62f-account-create-update-gxqzm" event={"ID":"afb3f389-31c1-4ada-838c-ccef884cc082","Type":"ContainerDied","Data":"4b617029315685824510bd6b26475a35037d8d90693ac99b87d56f3ed15cd88f"} Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.904057 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b617029315685824510bd6b26475a35037d8d90693ac99b87d56f3ed15cd88f" Mar 20 17:34:12 crc kubenswrapper[4803]: I0320 17:34:12.926494 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" podStartSLOduration=2.926471732 podStartE2EDuration="2.926471732s" podCreationTimestamp="2026-03-20 17:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:12.919289228 +0000 UTC 
m=+1062.830881298" watchObservedRunningTime="2026-03-20 17:34:12.926471732 +0000 UTC m=+1062.838063812" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.066764 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-nb\") pod \"56c26921-27aa-4a0b-a98c-a6cf7478920e\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.066845 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-sb\") pod \"56c26921-27aa-4a0b-a98c-a6cf7478920e\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.066892 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-swift-storage-0\") pod \"56c26921-27aa-4a0b-a98c-a6cf7478920e\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.066959 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvr4s\" (UniqueName: \"kubernetes.io/projected/56c26921-27aa-4a0b-a98c-a6cf7478920e-kube-api-access-hvr4s\") pod \"56c26921-27aa-4a0b-a98c-a6cf7478920e\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.066982 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-svc\") pod \"56c26921-27aa-4a0b-a98c-a6cf7478920e\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.067004 4803 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-config\") pod \"56c26921-27aa-4a0b-a98c-a6cf7478920e\" (UID: \"56c26921-27aa-4a0b-a98c-a6cf7478920e\") " Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.078517 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c26921-27aa-4a0b-a98c-a6cf7478920e-kube-api-access-hvr4s" (OuterVolumeSpecName: "kube-api-access-hvr4s") pod "56c26921-27aa-4a0b-a98c-a6cf7478920e" (UID: "56c26921-27aa-4a0b-a98c-a6cf7478920e"). InnerVolumeSpecName "kube-api-access-hvr4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.106919 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-config" (OuterVolumeSpecName: "config") pod "56c26921-27aa-4a0b-a98c-a6cf7478920e" (UID: "56c26921-27aa-4a0b-a98c-a6cf7478920e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.122218 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56c26921-27aa-4a0b-a98c-a6cf7478920e" (UID: "56c26921-27aa-4a0b-a98c-a6cf7478920e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.123214 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56c26921-27aa-4a0b-a98c-a6cf7478920e" (UID: "56c26921-27aa-4a0b-a98c-a6cf7478920e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.151111 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "56c26921-27aa-4a0b-a98c-a6cf7478920e" (UID: "56c26921-27aa-4a0b-a98c-a6cf7478920e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.151440 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56c26921-27aa-4a0b-a98c-a6cf7478920e" (UID: "56c26921-27aa-4a0b-a98c-a6cf7478920e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.170153 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.170192 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.170203 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.170215 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-swift-storage-0\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.170226 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvr4s\" (UniqueName: \"kubernetes.io/projected/56c26921-27aa-4a0b-a98c-a6cf7478920e-kube-api-access-hvr4s\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.170237 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56c26921-27aa-4a0b-a98c-a6cf7478920e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.253076 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-tqqrz"] Mar 20 17:34:13 crc kubenswrapper[4803]: I0320 17:34:13.294256 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-tqqrz"] Mar 20 17:34:14 crc kubenswrapper[4803]: I0320 17:34:14.216071 4803 scope.go:117] "RemoveContainer" containerID="755c8f2ba2febb53f0d50b53f01b630e5059c43f04aea417d4b341c0e21ff85f" Mar 20 17:34:14 crc kubenswrapper[4803]: I0320 17:34:14.888825 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c26921-27aa-4a0b-a98c-a6cf7478920e" path="/var/lib/kubelet/pods/56c26921-27aa-4a0b-a98c-a6cf7478920e/volumes" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.087156 4803 scope.go:117] "RemoveContainer" containerID="6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.236350 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.244251 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.292175 4803 scope.go:117] "RemoveContainer" containerID="3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14" Mar 20 17:34:16 crc kubenswrapper[4803]: E0320 17:34:16.292971 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14\": container with ID starting with 3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14 not found: ID does not exist" containerID="3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.293003 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14"} err="failed to get container status \"3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14\": rpc error: code = NotFound desc = could not find container \"3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14\": container with ID starting with 3de9c0fe97d9c2c400df485831ba4932a7be0ae9cda09b9023ad4944a19b2b14 not found: ID does not exist" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.293023 4803 scope.go:117] "RemoveContainer" containerID="6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c" Mar 20 17:34:16 crc kubenswrapper[4803]: E0320 17:34:16.293359 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c\": container with ID starting with 6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c not found: ID does not exist" containerID="6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c" Mar 20 17:34:16 crc 
kubenswrapper[4803]: I0320 17:34:16.293379 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c"} err="failed to get container status \"6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c\": rpc error: code = NotFound desc = could not find container \"6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c\": container with ID starting with 6dfda01122b246fab22f0eaf2c09b7f065293afdc283deb8b4768178b1fe6f6c not found: ID does not exist" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.302439 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.306934 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424151 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1d16ab-bd85-4b4b-8862-19f134432523-operator-scripts\") pod \"bf1d16ab-bd85-4b4b-8862-19f134432523\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424192 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv8bb\" (UniqueName: \"kubernetes.io/projected/d9b50998-7271-4d76-bf90-7a23ab8ae295-kube-api-access-qv8bb\") pod \"d9b50998-7271-4d76-bf90-7a23ab8ae295\" (UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424214 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b50998-7271-4d76-bf90-7a23ab8ae295-operator-scripts\") pod \"d9b50998-7271-4d76-bf90-7a23ab8ae295\" 
(UID: \"d9b50998-7271-4d76-bf90-7a23ab8ae295\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424232 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-operator-scripts\") pod \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424296 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f19fe4f0-4b51-440d-82ad-28541b098fc4-operator-scripts\") pod \"f19fe4f0-4b51-440d-82ad-28541b098fc4\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424329 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mg9q\" (UniqueName: \"kubernetes.io/projected/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-kube-api-access-8mg9q\") pod \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\" (UID: \"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424349 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47ht2\" (UniqueName: \"kubernetes.io/projected/f19fe4f0-4b51-440d-82ad-28541b098fc4-kube-api-access-47ht2\") pod \"f19fe4f0-4b51-440d-82ad-28541b098fc4\" (UID: \"f19fe4f0-4b51-440d-82ad-28541b098fc4\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.424396 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8srg\" (UniqueName: \"kubernetes.io/projected/bf1d16ab-bd85-4b4b-8862-19f134432523-kube-api-access-n8srg\") pod \"bf1d16ab-bd85-4b4b-8862-19f134432523\" (UID: \"bf1d16ab-bd85-4b4b-8862-19f134432523\") " Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.425757 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/configmap/d9b50998-7271-4d76-bf90-7a23ab8ae295-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d9b50998-7271-4d76-bf90-7a23ab8ae295" (UID: "d9b50998-7271-4d76-bf90-7a23ab8ae295"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.425828 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f19fe4f0-4b51-440d-82ad-28541b098fc4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f19fe4f0-4b51-440d-82ad-28541b098fc4" (UID: "f19fe4f0-4b51-440d-82ad-28541b098fc4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.425990 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf1d16ab-bd85-4b4b-8862-19f134432523-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf1d16ab-bd85-4b4b-8862-19f134432523" (UID: "bf1d16ab-bd85-4b4b-8862-19f134432523"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.426037 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc" (UID: "9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.429948 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f19fe4f0-4b51-440d-82ad-28541b098fc4-kube-api-access-47ht2" (OuterVolumeSpecName: "kube-api-access-47ht2") pod "f19fe4f0-4b51-440d-82ad-28541b098fc4" (UID: "f19fe4f0-4b51-440d-82ad-28541b098fc4"). InnerVolumeSpecName "kube-api-access-47ht2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.430012 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf1d16ab-bd85-4b4b-8862-19f134432523-kube-api-access-n8srg" (OuterVolumeSpecName: "kube-api-access-n8srg") pod "bf1d16ab-bd85-4b4b-8862-19f134432523" (UID: "bf1d16ab-bd85-4b4b-8862-19f134432523"). InnerVolumeSpecName "kube-api-access-n8srg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.430042 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-kube-api-access-8mg9q" (OuterVolumeSpecName: "kube-api-access-8mg9q") pod "9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc" (UID: "9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc"). InnerVolumeSpecName "kube-api-access-8mg9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.430721 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b50998-7271-4d76-bf90-7a23ab8ae295-kube-api-access-qv8bb" (OuterVolumeSpecName: "kube-api-access-qv8bb") pod "d9b50998-7271-4d76-bf90-7a23ab8ae295" (UID: "d9b50998-7271-4d76-bf90-7a23ab8ae295"). InnerVolumeSpecName "kube-api-access-qv8bb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525753 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf1d16ab-bd85-4b4b-8862-19f134432523-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525798 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv8bb\" (UniqueName: \"kubernetes.io/projected/d9b50998-7271-4d76-bf90-7a23ab8ae295-kube-api-access-qv8bb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525809 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d9b50998-7271-4d76-bf90-7a23ab8ae295-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525818 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525829 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f19fe4f0-4b51-440d-82ad-28541b098fc4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525836 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mg9q\" (UniqueName: \"kubernetes.io/projected/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc-kube-api-access-8mg9q\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525860 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47ht2\" (UniqueName: \"kubernetes.io/projected/f19fe4f0-4b51-440d-82ad-28541b098fc4-kube-api-access-47ht2\") on node \"crc\" DevicePath \"\"" Mar 20 
17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.525868 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8srg\" (UniqueName: \"kubernetes.io/projected/bf1d16ab-bd85-4b4b-8862-19f134432523-kube-api-access-n8srg\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.954832 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e380-account-create-update-9mlv9" event={"ID":"9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc","Type":"ContainerDied","Data":"9db5363bd6e9c912583788be4196d29f56894bfed0c8937d099a2322778e753d"} Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.954855 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e380-account-create-update-9mlv9" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.955494 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9db5363bd6e9c912583788be4196d29f56894bfed0c8937d099a2322778e753d" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.962303 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h24x7" event={"ID":"f19fe4f0-4b51-440d-82ad-28541b098fc4","Type":"ContainerDied","Data":"e296ec6b9eb50f0f62b3998f21aff1fb38ef791836f65e3c2fabbc4943a11582"} Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.962349 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e296ec6b9eb50f0f62b3998f21aff1fb38ef791836f65e3c2fabbc4943a11582" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.962911 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h24x7" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.965659 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-4eea-account-create-update-lxf2n" event={"ID":"bf1d16ab-bd85-4b4b-8862-19f134432523","Type":"ContainerDied","Data":"9193544e601958e179a92e801e9e87a8b4933b6fdee942b4555283766e602a05"} Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.965730 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9193544e601958e179a92e801e9e87a8b4933b6fdee942b4555283766e602a05" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.965840 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-4eea-account-create-update-lxf2n" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.969145 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sc6nk" event={"ID":"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2","Type":"ContainerStarted","Data":"482f2cd5e4eeff084f774916496418bf7d97bc2732fa016bab2f67e9a1e087d5"} Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.974601 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-tq5tc" event={"ID":"d9b50998-7271-4d76-bf90-7a23ab8ae295","Type":"ContainerDied","Data":"711cc039f843e77241791e6addc80220f37d1a0754d82616a887b84b66939943"} Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.974636 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="711cc039f843e77241791e6addc80220f37d1a0754d82616a887b84b66939943" Mar 20 17:34:16 crc kubenswrapper[4803]: I0320 17:34:16.974726 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-tq5tc" Mar 20 17:34:17 crc kubenswrapper[4803]: I0320 17:34:17.003011 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-sc6nk" podStartSLOduration=2.343507609 podStartE2EDuration="8.002992596s" podCreationTimestamp="2026-03-20 17:34:09 +0000 UTC" firstStartedPulling="2026-03-20 17:34:10.487743435 +0000 UTC m=+1060.399335505" lastFinishedPulling="2026-03-20 17:34:16.147228422 +0000 UTC m=+1066.058820492" observedRunningTime="2026-03-20 17:34:16.992739909 +0000 UTC m=+1066.904331989" watchObservedRunningTime="2026-03-20 17:34:17.002992596 +0000 UTC m=+1066.914584676" Mar 20 17:34:20 crc kubenswrapper[4803]: I0320 17:34:20.002798 4803 generic.go:334] "Generic (PLEG): container finished" podID="07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" containerID="482f2cd5e4eeff084f774916496418bf7d97bc2732fa016bab2f67e9a1e087d5" exitCode=0 Mar 20 17:34:20 crc kubenswrapper[4803]: I0320 17:34:20.002876 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sc6nk" event={"ID":"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2","Type":"ContainerDied","Data":"482f2cd5e4eeff084f774916496418bf7d97bc2732fa016bab2f67e9a1e087d5"} Mar 20 17:34:20 crc kubenswrapper[4803]: I0320 17:34:20.599802 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:20 crc kubenswrapper[4803]: I0320 17:34:20.766937 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lpnmp"] Mar 20 17:34:20 crc kubenswrapper[4803]: I0320 17:34:20.767186 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" podUID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerName="dnsmasq-dns" containerID="cri-o://013e259e785c8f3437ac6a9e1c3951e0053a1751848d23395a70804341d59b57" gracePeriod=10 Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 
17:34:21.015790 4803 generic.go:334] "Generic (PLEG): container finished" podID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerID="013e259e785c8f3437ac6a9e1c3951e0053a1751848d23395a70804341d59b57" exitCode=0 Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.016000 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" event={"ID":"87faf158-6b0e-491b-bb0e-0d9a7497290a","Type":"ContainerDied","Data":"013e259e785c8f3437ac6a9e1c3951e0053a1751848d23395a70804341d59b57"} Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.288592 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.331414 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-sb\") pod \"87faf158-6b0e-491b-bb0e-0d9a7497290a\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.331512 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-nb\") pod \"87faf158-6b0e-491b-bb0e-0d9a7497290a\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.331559 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgp7q\" (UniqueName: \"kubernetes.io/projected/87faf158-6b0e-491b-bb0e-0d9a7497290a-kube-api-access-tgp7q\") pod \"87faf158-6b0e-491b-bb0e-0d9a7497290a\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.331582 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-dns-svc\") pod \"87faf158-6b0e-491b-bb0e-0d9a7497290a\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.331598 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-config\") pod \"87faf158-6b0e-491b-bb0e-0d9a7497290a\" (UID: \"87faf158-6b0e-491b-bb0e-0d9a7497290a\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.338193 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87faf158-6b0e-491b-bb0e-0d9a7497290a-kube-api-access-tgp7q" (OuterVolumeSpecName: "kube-api-access-tgp7q") pod "87faf158-6b0e-491b-bb0e-0d9a7497290a" (UID: "87faf158-6b0e-491b-bb0e-0d9a7497290a"). InnerVolumeSpecName "kube-api-access-tgp7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.370871 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-config" (OuterVolumeSpecName: "config") pod "87faf158-6b0e-491b-bb0e-0d9a7497290a" (UID: "87faf158-6b0e-491b-bb0e-0d9a7497290a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.375634 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87faf158-6b0e-491b-bb0e-0d9a7497290a" (UID: "87faf158-6b0e-491b-bb0e-0d9a7497290a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.381414 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87faf158-6b0e-491b-bb0e-0d9a7497290a" (UID: "87faf158-6b0e-491b-bb0e-0d9a7497290a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.381469 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87faf158-6b0e-491b-bb0e-0d9a7497290a" (UID: "87faf158-6b0e-491b-bb0e-0d9a7497290a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.385429 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.432665 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.432695 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgp7q\" (UniqueName: \"kubernetes.io/projected/87faf158-6b0e-491b-bb0e-0d9a7497290a-kube-api-access-tgp7q\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.432707 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.432715 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.432724 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87faf158-6b0e-491b-bb0e-0d9a7497290a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.533237 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-config-data\") pod \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.533346 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-combined-ca-bundle\") 
pod \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.533460 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7plh2\" (UniqueName: \"kubernetes.io/projected/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-kube-api-access-7plh2\") pod \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\" (UID: \"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2\") " Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.536913 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-kube-api-access-7plh2" (OuterVolumeSpecName: "kube-api-access-7plh2") pod "07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" (UID: "07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2"). InnerVolumeSpecName "kube-api-access-7plh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.555096 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" (UID: "07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.602679 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-config-data" (OuterVolumeSpecName: "config-data") pod "07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" (UID: "07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.635026 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7plh2\" (UniqueName: \"kubernetes.io/projected/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-kube-api-access-7plh2\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.635262 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:21 crc kubenswrapper[4803]: I0320 17:34:21.635367 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.023429 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-sc6nk" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.023416 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-sc6nk" event={"ID":"07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2","Type":"ContainerDied","Data":"dcf9755f34949fae1c1bd7fd169dfa11ec5caa41b1da16c8ebd0413d81188d96"} Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.023562 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcf9755f34949fae1c1bd7fd169dfa11ec5caa41b1da16c8ebd0413d81188d96" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.026140 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" event={"ID":"87faf158-6b0e-491b-bb0e-0d9a7497290a","Type":"ContainerDied","Data":"6c9e6c800d8c53eaba15f25c29689660612862844be41b67ea73b8e7a6256a73"} Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.026181 4803 scope.go:117] "RemoveContainer" 
containerID="013e259e785c8f3437ac6a9e1c3951e0053a1751848d23395a70804341d59b57" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.026214 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-lpnmp" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.055284 4803 scope.go:117] "RemoveContainer" containerID="bb401b89fe039681af9e43d77d3b549292fabea4c919713f56aa391f3bb575b0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.077378 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lpnmp"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.084043 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-lpnmp"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311194 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qr6cw"] Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311789 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf1d16ab-bd85-4b4b-8862-19f134432523" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311805 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1d16ab-bd85-4b4b-8862-19f134432523" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311817 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerName="dnsmasq-dns" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311825 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerName="dnsmasq-dns" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311838 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" containerName="keystone-db-sync" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 
17:34:22.311844 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" containerName="keystone-db-sync" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311855 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerName="init" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311860 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerName="init" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311871 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311877 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311891 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerName="dnsmasq-dns" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311897 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerName="dnsmasq-dns" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311911 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b50998-7271-4d76-bf90-7a23ab8ae295" containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311919 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b50998-7271-4d76-bf90-7a23ab8ae295" containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311930 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea23ab0-9514-4256-aa1b-d477da1a19fe" containerName="mariadb-database-create" Mar 20 17:34:22 crc 
kubenswrapper[4803]: I0320 17:34:22.311936 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea23ab0-9514-4256-aa1b-d477da1a19fe" containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311948 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f19fe4f0-4b51-440d-82ad-28541b098fc4" containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311955 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f19fe4f0-4b51-440d-82ad-28541b098fc4" containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311961 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb3f389-31c1-4ada-838c-ccef884cc082" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311967 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb3f389-31c1-4ada-838c-ccef884cc082" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: E0320 17:34:22.311976 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerName="init" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.311982 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerName="init" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312117 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea23ab0-9514-4256-aa1b-d477da1a19fe" containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312128 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" containerName="keystone-db-sync" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312137 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f19fe4f0-4b51-440d-82ad-28541b098fc4" 
containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312145 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf1d16ab-bd85-4b4b-8862-19f134432523" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312154 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312164 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b50998-7271-4d76-bf90-7a23ab8ae295" containerName="mariadb-database-create" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312174 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb3f389-31c1-4ada-838c-ccef884cc082" containerName="mariadb-account-create-update" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312184 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c26921-27aa-4a0b-a98c-a6cf7478920e" containerName="dnsmasq-dns" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312192 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="87faf158-6b0e-491b-bb0e-0d9a7497290a" containerName="dnsmasq-dns" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.312657 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.316922 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.317257 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.317576 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8lcw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.317723 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.317865 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.331690 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qr6cw"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.357543 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-bnm99"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.358908 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.393067 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-bnm99"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448447 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtd5h\" (UniqueName: \"kubernetes.io/projected/9856f8cd-d4f9-4d58-9493-29b8072aa143-kube-api-access-wtd5h\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448523 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448559 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448578 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb4wc\" (UniqueName: \"kubernetes.io/projected/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-kube-api-access-hb4wc\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448594 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-config\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448613 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-config-data\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-scripts\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448656 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448687 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-combined-ca-bundle\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448704 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-fernet-keys\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448726 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.448748 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-credential-keys\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.475105 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75c7bb9db9-q6pzq"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.480297 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.489515 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.494771 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-v92kw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.494929 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.495045 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.505565 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c7bb9db9-q6pzq"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.549430 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfdlf\" (UniqueName: \"kubernetes.io/projected/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-kube-api-access-dfdlf\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.549664 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-horizon-secret-key\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.549751 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" 
(UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.549813 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.549872 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb4wc\" (UniqueName: \"kubernetes.io/projected/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-kube-api-access-hb4wc\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.549952 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-config\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550023 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-config-data\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550083 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-logs\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" 
Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550157 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-scripts\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550238 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550327 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-combined-ca-bundle\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550391 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-fernet-keys\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550450 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-config-data\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550515 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550592 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-scripts\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550664 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-credential-keys\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550727 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtd5h\" (UniqueName: \"kubernetes.io/projected/9856f8cd-d4f9-4d58-9493-29b8072aa143-kube-api-access-wtd5h\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.550963 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.551655 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.551899 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-config\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.554550 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.555207 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.559615 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-config-data\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.566711 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-combined-ca-bundle\") pod 
\"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.571492 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-fernet-keys\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.573642 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-scripts\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.577698 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb4wc\" (UniqueName: \"kubernetes.io/projected/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-kube-api-access-hb4wc\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.577890 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtd5h\" (UniqueName: \"kubernetes.io/projected/9856f8cd-d4f9-4d58-9493-29b8072aa143-kube-api-access-wtd5h\") pod \"dnsmasq-dns-bbf5cc879-bnm99\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.581990 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-credential-keys\") pod \"keystone-bootstrap-qr6cw\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " pod="openstack/keystone-bootstrap-qr6cw" Mar 20 
17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.643260 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dtdjf"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.644211 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.652846 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-config-data\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.653046 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-config-data\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.653120 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-scripts\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.653252 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc45n\" (UniqueName: \"kubernetes.io/projected/56c1156a-5e7a-4547-8a7b-46a55651b7a7-kube-api-access-rc45n\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.653347 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56c1156a-5e7a-4547-8a7b-46a55651b7a7-etc-machine-id\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.653415 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfdlf\" (UniqueName: \"kubernetes.io/projected/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-kube-api-access-dfdlf\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.661813 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-horizon-secret-key\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.661943 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-combined-ca-bundle\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.662008 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-scripts\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.662126 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-logs\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.662197 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-db-sync-config-data\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.656425 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-scripts\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.656948 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-config-data\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.656715 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.663700 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bbgpq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.664097 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-logs\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.664567 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.664949 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.679666 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-xm2fq"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.715279 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.718441 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfdlf\" (UniqueName: \"kubernetes.io/projected/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-kube-api-access-dfdlf\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.725520 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-horizon-secret-key\") pod \"horizon-75c7bb9db9-q6pzq\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.733259 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.750501 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2fq6h" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.754578 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.767381 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dtdjf"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.779909 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc45n\" (UniqueName: \"kubernetes.io/projected/56c1156a-5e7a-4547-8a7b-46a55651b7a7-kube-api-access-rc45n\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.779985 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-db-sync-config-data\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.780080 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56c1156a-5e7a-4547-8a7b-46a55651b7a7-etc-machine-id\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.780165 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8bvt\" (UniqueName: \"kubernetes.io/projected/1f3de51a-19ff-4714-b839-921efeeb3e48-kube-api-access-v8bvt\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.780201 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-combined-ca-bundle\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.780223 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-scripts\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.780361 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-db-sync-config-data\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.780483 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-combined-ca-bundle\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.780610 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-config-data\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.789047 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56c1156a-5e7a-4547-8a7b-46a55651b7a7-etc-machine-id\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.792552 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xm2fq"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.810130 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-db-sync-config-data\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.817800 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-scripts\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.817819 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-combined-ca-bundle\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.818217 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc45n\" (UniqueName: \"kubernetes.io/projected/56c1156a-5e7a-4547-8a7b-46a55651b7a7-kube-api-access-rc45n\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.819682 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.820016 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-config-data\") pod \"cinder-db-sync-dtdjf\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.835888 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.853309 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.857980 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6nn79" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.858231 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.863071 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.863663 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884654 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884725 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884747 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 
17:34:22.884768 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884809 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-db-sync-config-data\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884828 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884850 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884882 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8bvt\" (UniqueName: \"kubernetes.io/projected/1f3de51a-19ff-4714-b839-921efeeb3e48-kube-api-access-v8bvt\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.884898 
4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.885110 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhvp2\" (UniqueName: \"kubernetes.io/projected/5867f49b-235e-4934-9273-941b5e4c2d3c-kube-api-access-rhvp2\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.885168 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-combined-ca-bundle\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.892273 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-combined-ca-bundle\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.897763 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-db-sync-config-data\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.898855 4803 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="87faf158-6b0e-491b-bb0e-0d9a7497290a" path="/var/lib/kubelet/pods/87faf158-6b0e-491b-bb0e-0d9a7497290a/volumes" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.915633 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8bvt\" (UniqueName: \"kubernetes.io/projected/1f3de51a-19ff-4714-b839-921efeeb3e48-kube-api-access-v8bvt\") pod \"barbican-db-sync-xm2fq\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.940432 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6f59b89f4f-pqnkx"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.942697 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.969998 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-h45zt"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.971612 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.975302 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.975574 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mf4z4" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.975685 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986094 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986128 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986150 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986167 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986200 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986219 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986248 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.986288 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhvp2\" (UniqueName: \"kubernetes.io/projected/5867f49b-235e-4934-9273-941b5e4c2d3c-kube-api-access-rhvp2\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.988659 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-logs\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.989173 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.993615 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.993672 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.996993 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:34:22 crc kubenswrapper[4803]: I0320 17:34:22.998860 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.000034 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.008259 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.009916 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.013584 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.014259 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.026101 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhvp2\" (UniqueName: \"kubernetes.io/projected/5867f49b-235e-4934-9273-941b5e4c2d3c-kube-api-access-rhvp2\") pod \"glance-default-internal-api-0\" (UID: 
\"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.026642 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f59b89f4f-pqnkx"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.030128 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.033914 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.075342 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.080174 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.088835 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g96dv\" (UniqueName: \"kubernetes.io/projected/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-kube-api-access-g96dv\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.088911 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.088989 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-combined-ca-bundle\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089063 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-run-httpd\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089138 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-scripts\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089177 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-horizon-secret-key\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089249 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-config-data\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089294 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089321 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8z77\" (UniqueName: \"kubernetes.io/projected/1ff4df91-5788-4dc9-a817-6c6a41bb955c-kube-api-access-j8z77\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089482 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-config-data\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089617 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-log-httpd\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089708 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-logs\") pod 
\"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089796 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9whv\" (UniqueName: \"kubernetes.io/projected/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-kube-api-access-j9whv\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.089888 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-scripts\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.090127 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-config\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.108857 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.113852 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-bnm99"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.129395 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h45zt"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.162075 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.163447 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.165910 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.166264 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.171575 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7dldg"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.173271 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192384 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-combined-ca-bundle\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192473 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-run-httpd\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192519 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-scripts\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192597 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-horizon-secret-key\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192619 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-config-data\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192638 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192658 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8z77\" (UniqueName: \"kubernetes.io/projected/1ff4df91-5788-4dc9-a817-6c6a41bb955c-kube-api-access-j8z77\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192702 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-config-data\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192737 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-log-httpd\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192770 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-logs\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192808 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9whv\" (UniqueName: 
\"kubernetes.io/projected/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-kube-api-access-j9whv\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192840 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-scripts\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192858 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-config\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192906 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.192932 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g96dv\" (UniqueName: \"kubernetes.io/projected/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-kube-api-access-g96dv\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.194501 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-log-httpd\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc 
kubenswrapper[4803]: I0320 17:34:23.198052 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-75rtg"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.199167 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.203170 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pxj2c" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.203321 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.205600 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-75rtg"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.207395 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-logs\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.210551 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.218614 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-horizon-secret-key\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.224461 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9whv\" (UniqueName: \"kubernetes.io/projected/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-kube-api-access-j9whv\") pod \"ceilometer-0\" 
(UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.228339 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7dldg"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.232009 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.259648 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.264304 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-run-httpd\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.266890 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-scripts\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.267888 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-config-data\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.272143 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.273163 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-combined-ca-bundle\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.274258 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-config\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.275752 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g96dv\" (UniqueName: \"kubernetes.io/projected/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-kube-api-access-g96dv\") pod \"horizon-6f59b89f4f-pqnkx\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.277064 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8z77\" (UniqueName: \"kubernetes.io/projected/1ff4df91-5788-4dc9-a817-6c6a41bb955c-kube-api-access-j8z77\") pod \"neutron-db-sync-h45zt\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.280719 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-scripts\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " 
pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.285764 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-config-data\") pod \"ceilometer-0\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.292394 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d002f5-379f-4709-a3df-aeb253a8884b-logs\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307710 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307750 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-logs\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307799 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307833 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307873 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-config\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307896 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c94gp\" (UniqueName: \"kubernetes.io/projected/e6d002f5-379f-4709-a3df-aeb253a8884b-kube-api-access-c94gp\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307924 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307960 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.307979 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-scripts\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308022 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-combined-ca-bundle\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308058 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69wjk\" (UniqueName: \"kubernetes.io/projected/e163614d-669d-44cf-93bd-6e6107dcf86e-kube-api-access-69wjk\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308099 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-scripts\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308129 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308150 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308169 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-config-data\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308199 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tms2s\" (UniqueName: \"kubernetes.io/projected/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-kube-api-access-tms2s\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308219 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.308247 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-config-data\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.385207 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409480 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-combined-ca-bundle\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409541 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69wjk\" (UniqueName: \"kubernetes.io/projected/e163614d-669d-44cf-93bd-6e6107dcf86e-kube-api-access-69wjk\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409565 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-scripts\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409593 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" 
Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409609 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409626 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-config-data\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409642 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tms2s\" (UniqueName: \"kubernetes.io/projected/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-kube-api-access-tms2s\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409662 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409709 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-config-data\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409741 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d002f5-379f-4709-a3df-aeb253a8884b-logs\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409763 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409784 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-logs\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409809 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409833 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409865 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-config\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409890 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c94gp\" (UniqueName: \"kubernetes.io/projected/e6d002f5-379f-4709-a3df-aeb253a8884b-kube-api-access-c94gp\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409916 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409946 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.409962 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-scripts\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.410571 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-svc\") pod 
\"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.414332 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-config\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.414401 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.414704 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-logs\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.416617 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-combined-ca-bundle\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.417285 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.417685 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.418005 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-scripts\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.418240 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.418260 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.418395 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d002f5-379f-4709-a3df-aeb253a8884b-logs\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.418391 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.418789 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.419753 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-scripts\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.421240 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-config-data\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.439149 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-config-data\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.440727 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69wjk\" (UniqueName: 
\"kubernetes.io/projected/e163614d-669d-44cf-93bd-6e6107dcf86e-kube-api-access-69wjk\") pod \"dnsmasq-dns-56df8fb6b7-7dldg\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.442429 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c94gp\" (UniqueName: \"kubernetes.io/projected/e6d002f5-379f-4709-a3df-aeb253a8884b-kube-api-access-c94gp\") pod \"placement-db-sync-75rtg\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.450057 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tms2s\" (UniqueName: \"kubernetes.io/projected/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-kube-api-access-tms2s\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.474974 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.485444 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.498303 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.534749 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.560753 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.581432 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-75rtg" Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.593259 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qr6cw"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.616843 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-bnm99"] Mar 20 17:34:23 crc kubenswrapper[4803]: W0320 17:34:23.633600 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb42b6cef_c9e1_4bff_98e5_44d5d8f98985.slice/crio-b9dde87972feb9edaecf9ada460908997e3f0772ce0accc4c873b9b9311b870b WatchSource:0}: Error finding container b9dde87972feb9edaecf9ada460908997e3f0772ce0accc4c873b9b9311b870b: Status 404 returned error can't find the container with id b9dde87972feb9edaecf9ada460908997e3f0772ce0accc4c873b9b9311b870b Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.794849 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75c7bb9db9-q6pzq"] Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.912459 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dtdjf"] Mar 20 17:34:23 crc kubenswrapper[4803]: W0320 17:34:23.963888 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56c1156a_5e7a_4547_8a7b_46a55651b7a7.slice/crio-da37800cc614d82ce9be4b4f78ec36278003861543cd93fa77273b13e36bc53e WatchSource:0}: Error finding container 
da37800cc614d82ce9be4b4f78ec36278003861543cd93fa77273b13e36bc53e: Status 404 returned error can't find the container with id da37800cc614d82ce9be4b4f78ec36278003861543cd93fa77273b13e36bc53e Mar 20 17:34:23 crc kubenswrapper[4803]: I0320 17:34:23.986275 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-xm2fq"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.064719 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c7bb9db9-q6pzq" event={"ID":"e6d4a2a3-e213-45a5-b167-bbf8217eeca6","Type":"ContainerStarted","Data":"ec342f2729e5dc4e014353244370d0c70b9b7ce613648af415198d81df354907"} Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.065908 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xm2fq" event={"ID":"1f3de51a-19ff-4714-b839-921efeeb3e48","Type":"ContainerStarted","Data":"66a5aa89991d90aa6401b0c301dcaf63ad66c3e099846be3479f5af663593e04"} Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.067271 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" event={"ID":"9856f8cd-d4f9-4d58-9493-29b8072aa143","Type":"ContainerStarted","Data":"208133770b7463b9e9461e3cbf1d2d5aeaa8f254770e4c09b2ba8c4b4a0ba553"} Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.069035 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dtdjf" event={"ID":"56c1156a-5e7a-4547-8a7b-46a55651b7a7","Type":"ContainerStarted","Data":"da37800cc614d82ce9be4b4f78ec36278003861543cd93fa77273b13e36bc53e"} Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.075990 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qr6cw" event={"ID":"b42b6cef-c9e1-4bff-98e5-44d5d8f98985","Type":"ContainerStarted","Data":"b9dde87972feb9edaecf9ada460908997e3f0772ce0accc4c873b9b9311b870b"} Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.253515 4803 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.311769 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6f59b89f4f-pqnkx"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.433340 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-h45zt"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.477810 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.551371 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f59b89f4f-pqnkx"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.576277 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-59f5cc869c-j8h6v"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.577719 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.584549 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.602225 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-75rtg"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.614685 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.632619 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.639509 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-config-data\") pod \"horizon-59f5cc869c-j8h6v\" (UID: 
\"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.639785 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-scripts\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.639915 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3e2f8c-2803-4b30-8921-9f8815fe8211-logs\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.640002 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crt7v\" (UniqueName: \"kubernetes.io/projected/ff3e2f8c-2803-4b30-8921-9f8815fe8211-kube-api-access-crt7v\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.640133 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff3e2f8c-2803-4b30-8921-9f8815fe8211-horizon-secret-key\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: W0320 17:34:24.658033 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d002f5_379f_4709_a3df_aeb253a8884b.slice/crio-408aa3be400ceeca85699119448e316dabeb5a85fb997e96d16381cbc6033444 WatchSource:0}: 
Error finding container 408aa3be400ceeca85699119448e316dabeb5a85fb997e96d16381cbc6033444: Status 404 returned error can't find the container with id 408aa3be400ceeca85699119448e316dabeb5a85fb997e96d16381cbc6033444 Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.698601 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.707546 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59f5cc869c-j8h6v"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.716877 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7dldg"] Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.741330 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-scripts\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.741466 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3e2f8c-2803-4b30-8921-9f8815fe8211-logs\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.741624 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crt7v\" (UniqueName: \"kubernetes.io/projected/ff3e2f8c-2803-4b30-8921-9f8815fe8211-kube-api-access-crt7v\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.741755 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/ff3e2f8c-2803-4b30-8921-9f8815fe8211-horizon-secret-key\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.741862 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-config-data\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.742181 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3e2f8c-2803-4b30-8921-9f8815fe8211-logs\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.742270 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-scripts\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.743678 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-config-data\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.751151 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff3e2f8c-2803-4b30-8921-9f8815fe8211-horizon-secret-key\") pod \"horizon-59f5cc869c-j8h6v\" (UID: 
\"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.770045 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crt7v\" (UniqueName: \"kubernetes.io/projected/ff3e2f8c-2803-4b30-8921-9f8815fe8211-kube-api-access-crt7v\") pod \"horizon-59f5cc869c-j8h6v\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:24 crc kubenswrapper[4803]: I0320 17:34:24.926871 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.152815 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerStarted","Data":"d0cf40118b694a5b1c4dae13e3ac02b7e1fb3f036a87b457101598dc5c0f053f"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.195910 4803 generic.go:334] "Generic (PLEG): container finished" podID="9856f8cd-d4f9-4d58-9493-29b8072aa143" containerID="c7150a67a85bff828de1302d3fb5abc1f14e0458ec8a5ff5b41055597c810f12" exitCode=0 Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.196110 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" event={"ID":"9856f8cd-d4f9-4d58-9493-29b8072aa143","Type":"ContainerDied","Data":"c7150a67a85bff828de1302d3fb5abc1f14e0458ec8a5ff5b41055597c810f12"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.199394 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-75rtg" event={"ID":"e6d002f5-379f-4709-a3df-aeb253a8884b","Type":"ContainerStarted","Data":"408aa3be400ceeca85699119448e316dabeb5a85fb997e96d16381cbc6033444"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.201231 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h45zt" 
event={"ID":"1ff4df91-5788-4dc9-a817-6c6a41bb955c","Type":"ContainerStarted","Data":"05bc60563eba9143e7f0fa1da073b73155603ad561d734a6ab388763c0f62456"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.201298 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h45zt" event={"ID":"1ff4df91-5788-4dc9-a817-6c6a41bb955c","Type":"ContainerStarted","Data":"c5c47eb934256f5ffa9407838ae196680549d84b1b2ebad75bdec24535884464"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.205990 4803 generic.go:334] "Generic (PLEG): container finished" podID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerID="1e928c9d8c104b8f2d47f898ce22e1603870f5b9fa5b61353dfe75654422527d" exitCode=0 Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.206104 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" event={"ID":"e163614d-669d-44cf-93bd-6e6107dcf86e","Type":"ContainerDied","Data":"1e928c9d8c104b8f2d47f898ce22e1603870f5b9fa5b61353dfe75654422527d"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.206124 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" event={"ID":"e163614d-669d-44cf-93bd-6e6107dcf86e","Type":"ContainerStarted","Data":"b9435c8155ba4d34e276c3846071d571227d59578e317f2b65f6b71403ad86f6"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.208369 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qr6cw" event={"ID":"b42b6cef-c9e1-4bff-98e5-44d5d8f98985","Type":"ContainerStarted","Data":"90edf5a9dbd8c7f4398a1cedcc13d4bf83cd86e752363131cdb6db3760423011"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.215737 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5867f49b-235e-4934-9273-941b5e4c2d3c","Type":"ContainerStarted","Data":"e85aaf4a1821a6dcf06f630c3a73048eef445dcf86aedeeb759cc12ea790e89f"} Mar 20 17:34:25 crc 
kubenswrapper[4803]: I0320 17:34:25.219937 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f59b89f4f-pqnkx" event={"ID":"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5","Type":"ContainerStarted","Data":"89a80936bfd1c99e82ff3db8836a4e9fa75654ff48c2e7d04f1883166daa085a"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.223035 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86282f54-b60e-4398-9ed5-0aa6b33d2a1f","Type":"ContainerStarted","Data":"4604053a06afdcc9810bae801d060d29d26914fe90cc8fedd7e000be4d51854f"} Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.245951 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qr6cw" podStartSLOduration=3.245933333 podStartE2EDuration="3.245933333s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:25.238835351 +0000 UTC m=+1075.150427421" watchObservedRunningTime="2026-03-20 17:34:25.245933333 +0000 UTC m=+1075.157525423" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.266649 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-h45zt" podStartSLOduration=3.266635102 podStartE2EDuration="3.266635102s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:25.253315682 +0000 UTC m=+1075.164907762" watchObservedRunningTime="2026-03-20 17:34:25.266635102 +0000 UTC m=+1075.178227172" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.539355 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-59f5cc869c-j8h6v"] Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.674398 4803 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.761036 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtd5h\" (UniqueName: \"kubernetes.io/projected/9856f8cd-d4f9-4d58-9493-29b8072aa143-kube-api-access-wtd5h\") pod \"9856f8cd-d4f9-4d58-9493-29b8072aa143\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.761098 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-sb\") pod \"9856f8cd-d4f9-4d58-9493-29b8072aa143\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.761148 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-nb\") pod \"9856f8cd-d4f9-4d58-9493-29b8072aa143\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.761206 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-swift-storage-0\") pod \"9856f8cd-d4f9-4d58-9493-29b8072aa143\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.761237 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-svc\") pod \"9856f8cd-d4f9-4d58-9493-29b8072aa143\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.761269 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-config\") pod \"9856f8cd-d4f9-4d58-9493-29b8072aa143\" (UID: \"9856f8cd-d4f9-4d58-9493-29b8072aa143\") " Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.770955 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9856f8cd-d4f9-4d58-9493-29b8072aa143-kube-api-access-wtd5h" (OuterVolumeSpecName: "kube-api-access-wtd5h") pod "9856f8cd-d4f9-4d58-9493-29b8072aa143" (UID: "9856f8cd-d4f9-4d58-9493-29b8072aa143"). InnerVolumeSpecName "kube-api-access-wtd5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.787091 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9856f8cd-d4f9-4d58-9493-29b8072aa143" (UID: "9856f8cd-d4f9-4d58-9493-29b8072aa143"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.792364 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9856f8cd-d4f9-4d58-9493-29b8072aa143" (UID: "9856f8cd-d4f9-4d58-9493-29b8072aa143"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.793880 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-config" (OuterVolumeSpecName: "config") pod "9856f8cd-d4f9-4d58-9493-29b8072aa143" (UID: "9856f8cd-d4f9-4d58-9493-29b8072aa143"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.800508 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9856f8cd-d4f9-4d58-9493-29b8072aa143" (UID: "9856f8cd-d4f9-4d58-9493-29b8072aa143"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.802757 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9856f8cd-d4f9-4d58-9493-29b8072aa143" (UID: "9856f8cd-d4f9-4d58-9493-29b8072aa143"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.864132 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.864164 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.864172 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.864181 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.864189 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtd5h\" (UniqueName: \"kubernetes.io/projected/9856f8cd-d4f9-4d58-9493-29b8072aa143-kube-api-access-wtd5h\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:25 crc kubenswrapper[4803]: I0320 17:34:25.864199 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9856f8cd-d4f9-4d58-9493-29b8072aa143-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.251137 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" event={"ID":"9856f8cd-d4f9-4d58-9493-29b8072aa143","Type":"ContainerDied","Data":"208133770b7463b9e9461e3cbf1d2d5aeaa8f254770e4c09b2ba8c4b4a0ba553"} Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.251187 4803 scope.go:117] "RemoveContainer" containerID="c7150a67a85bff828de1302d3fb5abc1f14e0458ec8a5ff5b41055597c810f12" Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.251287 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-bnm99" Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.277991 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59f5cc869c-j8h6v" event={"ID":"ff3e2f8c-2803-4b30-8921-9f8815fe8211","Type":"ContainerStarted","Data":"d69dc7b41f2a25217c010ec1299698924e850db235b3cf91381081d53cef00a8"} Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.323577 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" event={"ID":"e163614d-669d-44cf-93bd-6e6107dcf86e","Type":"ContainerStarted","Data":"ad44d3471a9d026f800a7817071234aecf7531430d1475413ebefb3994ffda17"} Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.323632 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.333397 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5867f49b-235e-4934-9273-941b5e4c2d3c","Type":"ContainerStarted","Data":"09cac58cd958e133c9a655de86de8efc43701a85de59da52412acf0a0cbe9737"} Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.356157 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" podStartSLOduration=4.356139886 podStartE2EDuration="4.356139886s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:26.349948889 +0000 UTC m=+1076.261540969" watchObservedRunningTime="2026-03-20 17:34:26.356139886 +0000 UTC m=+1076.267731956" Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.403574 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-bnm99"] Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.410133 4803 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-bnm99"] Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.417763 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86282f54-b60e-4398-9ed5-0aa6b33d2a1f","Type":"ContainerStarted","Data":"be841f7c7e70a6df27fe2bf4b9fc8959399675ee60188ea94463b63879318d5b"} Mar 20 17:34:26 crc kubenswrapper[4803]: I0320 17:34:26.872164 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9856f8cd-d4f9-4d58-9493-29b8072aa143" path="/var/lib/kubelet/pods/9856f8cd-d4f9-4d58-9493-29b8072aa143/volumes" Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.434873 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5867f49b-235e-4934-9273-941b5e4c2d3c","Type":"ContainerStarted","Data":"639aa3d849cbe91340be307b259cb1b71e21916bef5f27f8b1c4fc4a19dd4ecb"} Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.435217 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-log" containerID="cri-o://09cac58cd958e133c9a655de86de8efc43701a85de59da52412acf0a0cbe9737" gracePeriod=30 Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.435355 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-httpd" containerID="cri-o://639aa3d849cbe91340be307b259cb1b71e21916bef5f27f8b1c4fc4a19dd4ecb" gracePeriod=30 Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.446300 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"86282f54-b60e-4398-9ed5-0aa6b33d2a1f","Type":"ContainerStarted","Data":"c4cda99442dba1443a977674c39c658ac332e0afa8b2c8f4ff2e1cd0ef849e93"} Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.446424 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-log" containerID="cri-o://be841f7c7e70a6df27fe2bf4b9fc8959399675ee60188ea94463b63879318d5b" gracePeriod=30 Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.446630 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-httpd" containerID="cri-o://c4cda99442dba1443a977674c39c658ac332e0afa8b2c8f4ff2e1cd0ef849e93" gracePeriod=30 Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.482279 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.482237208 podStartE2EDuration="5.482237208s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:27.467185881 +0000 UTC m=+1077.378777961" watchObservedRunningTime="2026-03-20 17:34:27.482237208 +0000 UTC m=+1077.393829278" Mar 20 17:34:27 crc kubenswrapper[4803]: I0320 17:34:27.486457 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.486438311 podStartE2EDuration="5.486438311s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:27.48600342 +0000 UTC m=+1077.397595490" watchObservedRunningTime="2026-03-20 17:34:27.486438311 +0000 UTC 
m=+1077.398030371" Mar 20 17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.462052 4803 generic.go:334] "Generic (PLEG): container finished" podID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerID="c4cda99442dba1443a977674c39c658ac332e0afa8b2c8f4ff2e1cd0ef849e93" exitCode=0 Mar 20 17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.462080 4803 generic.go:334] "Generic (PLEG): container finished" podID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerID="be841f7c7e70a6df27fe2bf4b9fc8959399675ee60188ea94463b63879318d5b" exitCode=143 Mar 20 17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.462133 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86282f54-b60e-4398-9ed5-0aa6b33d2a1f","Type":"ContainerDied","Data":"c4cda99442dba1443a977674c39c658ac332e0afa8b2c8f4ff2e1cd0ef849e93"} Mar 20 17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.462175 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86282f54-b60e-4398-9ed5-0aa6b33d2a1f","Type":"ContainerDied","Data":"be841f7c7e70a6df27fe2bf4b9fc8959399675ee60188ea94463b63879318d5b"} Mar 20 17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.465967 4803 generic.go:334] "Generic (PLEG): container finished" podID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerID="639aa3d849cbe91340be307b259cb1b71e21916bef5f27f8b1c4fc4a19dd4ecb" exitCode=0 Mar 20 17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.465991 4803 generic.go:334] "Generic (PLEG): container finished" podID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerID="09cac58cd958e133c9a655de86de8efc43701a85de59da52412acf0a0cbe9737" exitCode=143 Mar 20 17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.466013 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5867f49b-235e-4934-9273-941b5e4c2d3c","Type":"ContainerDied","Data":"639aa3d849cbe91340be307b259cb1b71e21916bef5f27f8b1c4fc4a19dd4ecb"} Mar 20 
17:34:28 crc kubenswrapper[4803]: I0320 17:34:28.466037 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5867f49b-235e-4934-9273-941b5e4c2d3c","Type":"ContainerDied","Data":"09cac58cd958e133c9a655de86de8efc43701a85de59da52412acf0a0cbe9737"} Mar 20 17:34:29 crc kubenswrapper[4803]: I0320 17:34:29.477828 4803 generic.go:334] "Generic (PLEG): container finished" podID="b42b6cef-c9e1-4bff-98e5-44d5d8f98985" containerID="90edf5a9dbd8c7f4398a1cedcc13d4bf83cd86e752363131cdb6db3760423011" exitCode=0 Mar 20 17:34:29 crc kubenswrapper[4803]: I0320 17:34:29.477886 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qr6cw" event={"ID":"b42b6cef-c9e1-4bff-98e5-44d5d8f98985","Type":"ContainerDied","Data":"90edf5a9dbd8c7f4398a1cedcc13d4bf83cd86e752363131cdb6db3760423011"} Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.290188 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75c7bb9db9-q6pzq"] Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.336294 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-75b65b9966-sh4pn"] Mar 20 17:34:31 crc kubenswrapper[4803]: E0320 17:34:31.338487 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9856f8cd-d4f9-4d58-9493-29b8072aa143" containerName="init" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.338510 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="9856f8cd-d4f9-4d58-9493-29b8072aa143" containerName="init" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.338718 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="9856f8cd-d4f9-4d58-9493-29b8072aa143" containerName="init" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.339535 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.341042 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.349550 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b65b9966-sh4pn"] Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.367192 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59f5cc869c-j8h6v"] Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.383177 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-logs\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.383225 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-tls-certs\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.383258 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwckr\" (UniqueName: \"kubernetes.io/projected/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-kube-api-access-fwckr\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.383294 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-combined-ca-bundle\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.383314 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-config-data\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.383357 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-secret-key\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.383379 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-scripts\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.393727 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-596cfc5b56-w5pbk"] Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.394981 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.426645 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-596cfc5b56-w5pbk"] Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485080 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-logs\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485127 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-tls-certs\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485164 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwckr\" (UniqueName: \"kubernetes.io/projected/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-kube-api-access-fwckr\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485183 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-combined-ca-bundle\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485222 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-combined-ca-bundle\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485244 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-config-data\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485260 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-horizon-secret-key\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485280 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6cvh\" (UniqueName: \"kubernetes.io/projected/76a924c8-a380-4ee2-a6ce-ac77f0979f24-kube-api-access-s6cvh\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485307 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76a924c8-a380-4ee2-a6ce-ac77f0979f24-config-data\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485324 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-horizon-tls-certs\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485351 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-secret-key\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485375 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-scripts\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485414 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a924c8-a380-4ee2-a6ce-ac77f0979f24-scripts\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.485435 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a924c8-a380-4ee2-a6ce-ac77f0979f24-logs\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.486079 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-logs\") pod \"horizon-75b65b9966-sh4pn\" (UID: 
\"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.486215 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-scripts\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.486492 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-config-data\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.490850 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-combined-ca-bundle\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.492199 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-secret-key\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.493405 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-tls-certs\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 
17:34:31.502403 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwckr\" (UniqueName: \"kubernetes.io/projected/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-kube-api-access-fwckr\") pod \"horizon-75b65b9966-sh4pn\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") " pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.587768 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-horizon-tls-certs\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.588510 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a924c8-a380-4ee2-a6ce-ac77f0979f24-scripts\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.588620 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a924c8-a380-4ee2-a6ce-ac77f0979f24-logs\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.588762 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-combined-ca-bundle\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.588862 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-horizon-secret-key\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.588927 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6cvh\" (UniqueName: \"kubernetes.io/projected/76a924c8-a380-4ee2-a6ce-ac77f0979f24-kube-api-access-s6cvh\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.589002 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76a924c8-a380-4ee2-a6ce-ac77f0979f24-config-data\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.590057 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76a924c8-a380-4ee2-a6ce-ac77f0979f24-config-data\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.590497 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76a924c8-a380-4ee2-a6ce-ac77f0979f24-logs\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.590645 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76a924c8-a380-4ee2-a6ce-ac77f0979f24-scripts\") pod \"horizon-596cfc5b56-w5pbk\" 
(UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.593282 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-horizon-tls-certs\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.595808 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-horizon-secret-key\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.595843 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a924c8-a380-4ee2-a6ce-ac77f0979f24-combined-ca-bundle\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.606604 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6cvh\" (UniqueName: \"kubernetes.io/projected/76a924c8-a380-4ee2-a6ce-ac77f0979f24-kube-api-access-s6cvh\") pod \"horizon-596cfc5b56-w5pbk\" (UID: \"76a924c8-a380-4ee2-a6ce-ac77f0979f24\") " pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.661301 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:34:31 crc kubenswrapper[4803]: I0320 17:34:31.722997 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:34:33 crc kubenswrapper[4803]: I0320 17:34:33.562914 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:34:33 crc kubenswrapper[4803]: I0320 17:34:33.675988 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wgnp4"] Mar 20 17:34:33 crc kubenswrapper[4803]: I0320 17:34:33.676261 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" containerID="cri-o://ba659fc0cdd28b23c62059f98ea06997fdfba92de1c809f239614a6726b8efd7" gracePeriod=10 Mar 20 17:34:34 crc kubenswrapper[4803]: I0320 17:34:34.527834 4803 generic.go:334] "Generic (PLEG): container finished" podID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerID="ba659fc0cdd28b23c62059f98ea06997fdfba92de1c809f239614a6726b8efd7" exitCode=0 Mar 20 17:34:34 crc kubenswrapper[4803]: I0320 17:34:34.527914 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" event={"ID":"b6450f12-4337-43e5-b4e8-817e6b9a8d8f","Type":"ContainerDied","Data":"ba659fc0cdd28b23c62059f98ea06997fdfba92de1c809f239614a6726b8efd7"} Mar 20 17:34:35 crc kubenswrapper[4803]: I0320 17:34:35.598939 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 20 17:34:40 crc kubenswrapper[4803]: I0320 17:34:40.599295 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 20 
17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.109081 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.129544 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.220718 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-combined-ca-bundle\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.220786 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-public-tls-certs\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.220856 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-config-data\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.220919 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-logs\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.220941 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-httpd-run\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.220971 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhvp2\" (UniqueName: \"kubernetes.io/projected/5867f49b-235e-4934-9273-941b5e4c2d3c-kube-api-access-rhvp2\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221005 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-combined-ca-bundle\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221031 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-scripts\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221079 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221097 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tms2s\" (UniqueName: \"kubernetes.io/projected/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-kube-api-access-tms2s\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221134 4803 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221150 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-logs\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221170 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-config-data\") pod \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\" (UID: \"86282f54-b60e-4398-9ed5-0aa6b33d2a1f\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221190 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-internal-tls-certs\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221208 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-httpd-run\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.221242 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-scripts\") pod \"5867f49b-235e-4934-9273-941b5e4c2d3c\" (UID: \"5867f49b-235e-4934-9273-941b5e4c2d3c\") " Mar 20 17:34:43 crc 
kubenswrapper[4803]: I0320 17:34:43.222159 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.222222 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-logs" (OuterVolumeSpecName: "logs") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.222424 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-logs" (OuterVolumeSpecName: "logs") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.229124 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5867f49b-235e-4934-9273-941b5e4c2d3c-kube-api-access-rhvp2" (OuterVolumeSpecName: "kube-api-access-rhvp2") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "kube-api-access-rhvp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.229184 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-kube-api-access-tms2s" (OuterVolumeSpecName: "kube-api-access-tms2s") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "kube-api-access-tms2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.229400 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.229685 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-scripts" (OuterVolumeSpecName: "scripts") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.229846 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.231383 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-scripts" (OuterVolumeSpecName: "scripts") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.245117 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.252730 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.272307 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.278831 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-config-data" (OuterVolumeSpecName: "config-data") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.280830 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5867f49b-235e-4934-9273-941b5e4c2d3c" (UID: "5867f49b-235e-4934-9273-941b5e4c2d3c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.281310 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-config-data" (OuterVolumeSpecName: "config-data") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.286150 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "86282f54-b60e-4398-9ed5-0aa6b33d2a1f" (UID: "86282f54-b60e-4398-9ed5-0aa6b33d2a1f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322716 4803 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322753 4803 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322762 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322771 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322780 4803 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322788 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322796 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322804 4803 reconciler_common.go:293] "Volume detached for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322812 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhvp2\" (UniqueName: \"kubernetes.io/projected/5867f49b-235e-4934-9273-941b5e4c2d3c-kube-api-access-rhvp2\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322822 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5867f49b-235e-4934-9273-941b5e4c2d3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322830 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322862 4803 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322874 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tms2s\" (UniqueName: \"kubernetes.io/projected/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-kube-api-access-tms2s\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322889 4803 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322901 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5867f49b-235e-4934-9273-941b5e4c2d3c-logs\") on node \"crc\" DevicePath \"\"" Mar 20 
17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.322912 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86282f54-b60e-4398-9ed5-0aa6b33d2a1f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.343396 4803 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.345624 4803 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.424551 4803 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.424588 4803 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.608867 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.608859 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5867f49b-235e-4934-9273-941b5e4c2d3c","Type":"ContainerDied","Data":"e85aaf4a1821a6dcf06f630c3a73048eef445dcf86aedeeb759cc12ea790e89f"} Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.608956 4803 scope.go:117] "RemoveContainer" containerID="639aa3d849cbe91340be307b259cb1b71e21916bef5f27f8b1c4fc4a19dd4ecb" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.611906 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86282f54-b60e-4398-9ed5-0aa6b33d2a1f","Type":"ContainerDied","Data":"4604053a06afdcc9810bae801d060d29d26914fe90cc8fedd7e000be4d51854f"} Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.611991 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.656960 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.665194 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.681673 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.688490 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.697588 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: E0320 17:34:43.698109 4803 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-log" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698125 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-log" Mar 20 17:34:43 crc kubenswrapper[4803]: E0320 17:34:43.698140 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-httpd" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698147 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-httpd" Mar 20 17:34:43 crc kubenswrapper[4803]: E0320 17:34:43.698154 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-httpd" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698159 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-httpd" Mar 20 17:34:43 crc kubenswrapper[4803]: E0320 17:34:43.698171 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-log" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698176 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-log" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698340 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-httpd" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698354 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" containerName="glance-log" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698365 4803 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-log" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.698378 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" containerName="glance-httpd" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.699241 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.703721 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.704150 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.705465 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.704135 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.704271 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6nn79" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.704506 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.711732 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.714191 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.743959 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.787036 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835253 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-logs\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835292 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835313 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835331 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t772\" (UniqueName: \"kubernetes.io/projected/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-kube-api-access-8t772\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835384 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835428 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpk9\" (UniqueName: \"kubernetes.io/projected/824aff30-6d5e-4489-bf27-79910aafe31e-kube-api-access-zzpk9\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835451 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835478 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835499 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835540 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835561 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835587 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835601 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-scripts\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835620 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835635 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-config-data\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.835663 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937202 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937268 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937299 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937318 4803 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-scripts\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937348 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937373 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-config-data\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937406 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937432 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937455 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-logs\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937485 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937508 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937545 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t772\" (UniqueName: \"kubernetes.io/projected/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-kube-api-access-8t772\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937587 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937614 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzpk9\" (UniqueName: 
\"kubernetes.io/projected/824aff30-6d5e-4489-bf27-79910aafe31e-kube-api-access-zzpk9\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937640 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937680 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.937734 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.938138 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.938513 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-logs\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.938886 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.939178 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-logs\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.939449 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.950657 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.950784 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.951177 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.951423 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.956747 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-scripts\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.959230 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.960613 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-config-data\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 
17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.960681 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.967257 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t772\" (UniqueName: \"kubernetes.io/projected/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-kube-api-access-8t772\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:43 crc kubenswrapper[4803]: I0320 17:34:43.982480 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpk9\" (UniqueName: \"kubernetes.io/projected/824aff30-6d5e-4489-bf27-79910aafe31e-kube-api-access-zzpk9\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:44 crc kubenswrapper[4803]: I0320 17:34:44.003731 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " pod="openstack/glance-default-internal-api-0" Mar 20 17:34:44 crc kubenswrapper[4803]: I0320 17:34:44.050765 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " pod="openstack/glance-default-external-api-0" Mar 20 17:34:44 crc kubenswrapper[4803]: I0320 17:34:44.054166 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:34:44 crc kubenswrapper[4803]: I0320 17:34:44.068935 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:34:44 crc kubenswrapper[4803]: I0320 17:34:44.856413 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5867f49b-235e-4934-9273-941b5e4c2d3c" path="/var/lib/kubelet/pods/5867f49b-235e-4934-9273-941b5e4c2d3c/volumes" Mar 20 17:34:44 crc kubenswrapper[4803]: I0320 17:34:44.857214 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86282f54-b60e-4398-9ed5-0aa6b33d2a1f" path="/var/lib/kubelet/pods/86282f54-b60e-4398-9ed5-0aa6b33d2a1f/volumes" Mar 20 17:34:45 crc kubenswrapper[4803]: I0320 17:34:45.599259 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 20 17:34:45 crc kubenswrapper[4803]: I0320 17:34:45.599746 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.206406 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.206805 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n554h68chfh78h665h9dh5hd8h558h568hc9h5b6h74h76hc7h644hbfh7h665h54hf7h56fhdbhc7h58dh5fbh5cbh586h67dh5bdhb7h87q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g96dv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6f59b89f4f-pqnkx_openstack(5567f335-1a97-4f2d-9cf0-da1e1fcab1c5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.208981 
4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6f59b89f4f-pqnkx" podUID="5567f335-1a97-4f2d-9cf0-da1e1fcab1c5" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.225440 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.225584 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfbh94hd6h66fh667h645hb6h6dh5b5h58bhfdhfbh85h548h567h6chd6h5f5h5c9h55dh65fhc6hcdh58hcdh5d7h68dh68bh54h5c7h5bh587q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-crt7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-59f5cc869c-j8h6v_openstack(ff3e2f8c-2803-4b30-8921-9f8815fe8211): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 
17:34:48.227769 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-59f5cc869c-j8h6v" podUID="ff3e2f8c-2803-4b30-8921-9f8815fe8211" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.250734 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.250905 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n59h5fdh547h88h699h596h6h554hc4h579h65fhbdh5f8h7chc7h88h64dh66fh57fh8ch679h65bh5cfh67ch587h585hc6h57bh7dh64bh68fh655q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dfdlf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-75c7bb9db9-q6pzq_openstack(e6d4a2a3-e213-45a5-b167-bbf8217eeca6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 
17:34:48.253219 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-75c7bb9db9-q6pzq" podUID="e6d4a2a3-e213-45a5-b167-bbf8217eeca6" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.265872 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.313142 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-fernet-keys\") pod \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.313276 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-combined-ca-bundle\") pod \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.313340 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-credential-keys\") pod \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.313402 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-scripts\") pod 
\"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.313437 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-config-data\") pod \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.313463 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb4wc\" (UniqueName: \"kubernetes.io/projected/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-kube-api-access-hb4wc\") pod \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\" (UID: \"b42b6cef-c9e1-4bff-98e5-44d5d8f98985\") " Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.318688 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-scripts" (OuterVolumeSpecName: "scripts") pod "b42b6cef-c9e1-4bff-98e5-44d5d8f98985" (UID: "b42b6cef-c9e1-4bff-98e5-44d5d8f98985"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.318820 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b42b6cef-c9e1-4bff-98e5-44d5d8f98985" (UID: "b42b6cef-c9e1-4bff-98e5-44d5d8f98985"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.328745 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-kube-api-access-hb4wc" (OuterVolumeSpecName: "kube-api-access-hb4wc") pod "b42b6cef-c9e1-4bff-98e5-44d5d8f98985" (UID: "b42b6cef-c9e1-4bff-98e5-44d5d8f98985"). InnerVolumeSpecName "kube-api-access-hb4wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.334025 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b42b6cef-c9e1-4bff-98e5-44d5d8f98985" (UID: "b42b6cef-c9e1-4bff-98e5-44d5d8f98985"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.340062 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b42b6cef-c9e1-4bff-98e5-44d5d8f98985" (UID: "b42b6cef-c9e1-4bff-98e5-44d5d8f98985"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.340090 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-config-data" (OuterVolumeSpecName: "config-data") pod "b42b6cef-c9e1-4bff-98e5-44d5d8f98985" (UID: "b42b6cef-c9e1-4bff-98e5-44d5d8f98985"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.415893 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.415991 4803 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.416002 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.416010 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.416018 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb4wc\" (UniqueName: \"kubernetes.io/projected/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-kube-api-access-hb4wc\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.416027 4803 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b42b6cef-c9e1-4bff-98e5-44d5d8f98985-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.592641 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 17:34:48 crc kubenswrapper[4803]: E0320 17:34:48.592821 4803 
kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndbh597h6dh5f8hddhb4h59dh64ch684h5c4hcbh67h648h5cch77h67bhch5d4h79h58bh97h54dhcch55dh54dh5f9h58ch66h5bdh5bbh78h6bq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j9whv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.596154 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.665417 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qr6cw" Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.665412 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qr6cw" event={"ID":"b42b6cef-c9e1-4bff-98e5-44d5d8f98985","Type":"ContainerDied","Data":"b9dde87972feb9edaecf9ada460908997e3f0772ce0accc4c873b9b9311b870b"} Mar 20 17:34:48 crc kubenswrapper[4803]: I0320 17:34:48.665532 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9dde87972feb9edaecf9ada460908997e3f0772ce0accc4c873b9b9311b870b" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.352124 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qr6cw"] Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.360909 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qr6cw"] Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.455016 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xfvq7"] Mar 20 17:34:49 crc kubenswrapper[4803]: E0320 17:34:49.455503 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42b6cef-c9e1-4bff-98e5-44d5d8f98985" containerName="keystone-bootstrap" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.455548 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42b6cef-c9e1-4bff-98e5-44d5d8f98985" containerName="keystone-bootstrap" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.455823 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42b6cef-c9e1-4bff-98e5-44d5d8f98985" containerName="keystone-bootstrap" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.456540 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.459133 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.459678 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.459939 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.460539 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.460717 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8lcw" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.470043 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xfvq7"] Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.548161 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-scripts\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.548219 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcbp2\" (UniqueName: \"kubernetes.io/projected/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-kube-api-access-dcbp2\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.548256 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-fernet-keys\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.548307 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-combined-ca-bundle\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.548335 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-credential-keys\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.548370 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-config-data\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.650095 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-combined-ca-bundle\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.650161 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-credential-keys\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.651174 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-config-data\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.651303 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-scripts\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.653806 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcbp2\" (UniqueName: \"kubernetes.io/projected/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-kube-api-access-dcbp2\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.653932 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-fernet-keys\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.657876 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-config-data\") pod \"keystone-bootstrap-xfvq7\" (UID: 
\"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.658312 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-scripts\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.660266 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-fernet-keys\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.668861 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-credential-keys\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.675141 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcbp2\" (UniqueName: \"kubernetes.io/projected/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-kube-api-access-dcbp2\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.677218 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-combined-ca-bundle\") pod \"keystone-bootstrap-xfvq7\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 
17:34:49.678776 4803 generic.go:334] "Generic (PLEG): container finished" podID="1ff4df91-5788-4dc9-a817-6c6a41bb955c" containerID="05bc60563eba9143e7f0fa1da073b73155603ad561d734a6ab388763c0f62456" exitCode=0 Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.678827 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h45zt" event={"ID":"1ff4df91-5788-4dc9-a817-6c6a41bb955c","Type":"ContainerDied","Data":"05bc60563eba9143e7f0fa1da073b73155603ad561d734a6ab388763c0f62456"} Mar 20 17:34:49 crc kubenswrapper[4803]: I0320 17:34:49.773977 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:34:50 crc kubenswrapper[4803]: I0320 17:34:50.599251 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.139:5353: connect: connection refused" Mar 20 17:34:50 crc kubenswrapper[4803]: I0320 17:34:50.865650 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b42b6cef-c9e1-4bff-98e5-44d5d8f98985" path="/var/lib/kubelet/pods/b42b6cef-c9e1-4bff-98e5-44d5d8f98985/volumes" Mar 20 17:34:51 crc kubenswrapper[4803]: E0320 17:34:51.912934 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 17:34:51 crc kubenswrapper[4803]: E0320 17:34:51.913179 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rc45n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dtdjf_openstack(56c1156a-5e7a-4547-8a7b-46a55651b7a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:34:51 crc kubenswrapper[4803]: E0320 17:34:51.914592 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dtdjf" podUID="56c1156a-5e7a-4547-8a7b-46a55651b7a7" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.014759 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.025428 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.028708 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.096752 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-scripts\") pod \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.096876 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-horizon-secret-key\") pod \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.096900 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfdlf\" (UniqueName: \"kubernetes.io/projected/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-kube-api-access-dfdlf\") pod \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.096933 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-scripts\") pod \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.096957 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-config-data\") pod \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097041 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g96dv\" 
(UniqueName: \"kubernetes.io/projected/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-kube-api-access-g96dv\") pod \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097070 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-logs\") pod \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097119 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-scripts\") pod \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097145 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff3e2f8c-2803-4b30-8921-9f8815fe8211-horizon-secret-key\") pod \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097181 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3e2f8c-2803-4b30-8921-9f8815fe8211-logs\") pod \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097212 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-logs\") pod \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097251 4803 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-crt7v\" (UniqueName: \"kubernetes.io/projected/ff3e2f8c-2803-4b30-8921-9f8815fe8211-kube-api-access-crt7v\") pod \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097276 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-horizon-secret-key\") pod \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\" (UID: \"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097301 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-config-data\") pod \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\" (UID: \"e6d4a2a3-e213-45a5-b167-bbf8217eeca6\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097326 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-config-data\") pod \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\" (UID: \"ff3e2f8c-2803-4b30-8921-9f8815fe8211\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.097955 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-logs" (OuterVolumeSpecName: "logs") pod "e6d4a2a3-e213-45a5-b167-bbf8217eeca6" (UID: "e6d4a2a3-e213-45a5-b167-bbf8217eeca6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.098507 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-config-data" (OuterVolumeSpecName: "config-data") pod "ff3e2f8c-2803-4b30-8921-9f8815fe8211" (UID: "ff3e2f8c-2803-4b30-8921-9f8815fe8211"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.098667 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-scripts" (OuterVolumeSpecName: "scripts") pod "ff3e2f8c-2803-4b30-8921-9f8815fe8211" (UID: "ff3e2f8c-2803-4b30-8921-9f8815fe8211"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.099259 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-scripts" (OuterVolumeSpecName: "scripts") pod "e6d4a2a3-e213-45a5-b167-bbf8217eeca6" (UID: "e6d4a2a3-e213-45a5-b167-bbf8217eeca6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.099349 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-scripts" (OuterVolumeSpecName: "scripts") pod "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5" (UID: "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.099487 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff3e2f8c-2803-4b30-8921-9f8815fe8211-logs" (OuterVolumeSpecName: "logs") pod "ff3e2f8c-2803-4b30-8921-9f8815fe8211" (UID: "ff3e2f8c-2803-4b30-8921-9f8815fe8211"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.100227 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-config-data" (OuterVolumeSpecName: "config-data") pod "e6d4a2a3-e213-45a5-b167-bbf8217eeca6" (UID: "e6d4a2a3-e213-45a5-b167-bbf8217eeca6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.102955 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-config-data" (OuterVolumeSpecName: "config-data") pod "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5" (UID: "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.109760 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-logs" (OuterVolumeSpecName: "logs") pod "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5" (UID: "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.110470 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5" (UID: "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.111942 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff3e2f8c-2803-4b30-8921-9f8815fe8211-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ff3e2f8c-2803-4b30-8921-9f8815fe8211" (UID: "ff3e2f8c-2803-4b30-8921-9f8815fe8211"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.112750 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff3e2f8c-2803-4b30-8921-9f8815fe8211-kube-api-access-crt7v" (OuterVolumeSpecName: "kube-api-access-crt7v") pod "ff3e2f8c-2803-4b30-8921-9f8815fe8211" (UID: "ff3e2f8c-2803-4b30-8921-9f8815fe8211"). InnerVolumeSpecName "kube-api-access-crt7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.112985 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e6d4a2a3-e213-45a5-b167-bbf8217eeca6" (UID: "e6d4a2a3-e213-45a5-b167-bbf8217eeca6"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.114661 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-kube-api-access-g96dv" (OuterVolumeSpecName: "kube-api-access-g96dv") pod "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5" (UID: "5567f335-1a97-4f2d-9cf0-da1e1fcab1c5"). InnerVolumeSpecName "kube-api-access-g96dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.119959 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-kube-api-access-dfdlf" (OuterVolumeSpecName: "kube-api-access-dfdlf") pod "e6d4a2a3-e213-45a5-b167-bbf8217eeca6" (UID: "e6d4a2a3-e213-45a5-b167-bbf8217eeca6"). InnerVolumeSpecName "kube-api-access-dfdlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199784 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199815 4803 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ff3e2f8c-2803-4b30-8921-9f8815fe8211-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199827 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff3e2f8c-2803-4b30-8921-9f8815fe8211-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199841 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-logs\") on node \"crc\" DevicePath 
\"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199852 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crt7v\" (UniqueName: \"kubernetes.io/projected/ff3e2f8c-2803-4b30-8921-9f8815fe8211-kube-api-access-crt7v\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199864 4803 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199876 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199886 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199897 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff3e2f8c-2803-4b30-8921-9f8815fe8211-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199911 4803 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199921 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfdlf\" (UniqueName: \"kubernetes.io/projected/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-kube-api-access-dfdlf\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199930 4803 reconciler_common.go:293] "Volume 
detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199938 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199947 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g96dv\" (UniqueName: \"kubernetes.io/projected/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5-kube-api-access-g96dv\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.199956 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d4a2a3-e213-45a5-b167-bbf8217eeca6-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: E0320 17:34:52.636084 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 17:34:52 crc kubenswrapper[4803]: E0320 17:34:52.636332 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8bvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-xm2fq_openstack(1f3de51a-19ff-4714-b839-921efeeb3e48): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 17:34:52 crc kubenswrapper[4803]: E0320 17:34:52.637501 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-xm2fq" 
podUID="1f3de51a-19ff-4714-b839-921efeeb3e48" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.659759 4803 scope.go:117] "RemoveContainer" containerID="09cac58cd958e133c9a655de86de8efc43701a85de59da52412acf0a0cbe9737" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.684688 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.714282 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8z77\" (UniqueName: \"kubernetes.io/projected/1ff4df91-5788-4dc9-a817-6c6a41bb955c-kube-api-access-j8z77\") pod \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.715164 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-combined-ca-bundle\") pod \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.715254 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-config\") pod \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\" (UID: \"1ff4df91-5788-4dc9-a817-6c6a41bb955c\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.715915 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.722196 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ff4df91-5788-4dc9-a817-6c6a41bb955c-kube-api-access-j8z77" (OuterVolumeSpecName: "kube-api-access-j8z77") pod "1ff4df91-5788-4dc9-a817-6c6a41bb955c" (UID: "1ff4df91-5788-4dc9-a817-6c6a41bb955c"). InnerVolumeSpecName "kube-api-access-j8z77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.742817 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6f59b89f4f-pqnkx" event={"ID":"5567f335-1a97-4f2d-9cf0-da1e1fcab1c5","Type":"ContainerDied","Data":"89a80936bfd1c99e82ff3db8836a4e9fa75654ff48c2e7d04f1883166daa085a"} Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.742915 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6f59b89f4f-pqnkx" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.745494 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" event={"ID":"b6450f12-4337-43e5-b4e8-817e6b9a8d8f","Type":"ContainerDied","Data":"1a69aa12ee69141d06d700f260cc940ca9dba6792d61d0eae8c70d054661a5e6"} Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.745507 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-wgnp4" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.746842 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75c7bb9db9-q6pzq" event={"ID":"e6d4a2a3-e213-45a5-b167-bbf8217eeca6","Type":"ContainerDied","Data":"ec342f2729e5dc4e014353244370d0c70b9b7ce613648af415198d81df354907"} Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.746899 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-75c7bb9db9-q6pzq" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.773792 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-h45zt" event={"ID":"1ff4df91-5788-4dc9-a817-6c6a41bb955c","Type":"ContainerDied","Data":"c5c47eb934256f5ffa9407838ae196680549d84b1b2ebad75bdec24535884464"} Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.773831 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5c47eb934256f5ffa9407838ae196680549d84b1b2ebad75bdec24535884464" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.773903 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-h45zt" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.777588 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-config" (OuterVolumeSpecName: "config") pod "1ff4df91-5788-4dc9-a817-6c6a41bb955c" (UID: "1ff4df91-5788-4dc9-a817-6c6a41bb955c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.782899 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ff4df91-5788-4dc9-a817-6c6a41bb955c" (UID: "1ff4df91-5788-4dc9-a817-6c6a41bb955c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.783482 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-59f5cc869c-j8h6v" event={"ID":"ff3e2f8c-2803-4b30-8921-9f8815fe8211","Type":"ContainerDied","Data":"d69dc7b41f2a25217c010ec1299698924e850db235b3cf91381081d53cef00a8"} Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.783491 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-59f5cc869c-j8h6v" Mar 20 17:34:52 crc kubenswrapper[4803]: E0320 17:34:52.792975 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-xm2fq" podUID="1f3de51a-19ff-4714-b839-921efeeb3e48" Mar 20 17:34:52 crc kubenswrapper[4803]: E0320 17:34:52.793276 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-dtdjf" podUID="56c1156a-5e7a-4547-8a7b-46a55651b7a7" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.822635 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-sb\") pod \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.823178 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-swift-storage-0\") pod 
\"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.823405 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdkml\" (UniqueName: \"kubernetes.io/projected/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-kube-api-access-xdkml\") pod \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.823503 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-config\") pod \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.823557 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-svc\") pod \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.823613 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-nb\") pod \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\" (UID: \"b6450f12-4337-43e5-b4e8-817e6b9a8d8f\") " Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.825641 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.825671 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ff4df91-5788-4dc9-a817-6c6a41bb955c-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.825685 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8z77\" (UniqueName: \"kubernetes.io/projected/1ff4df91-5788-4dc9-a817-6c6a41bb955c-kube-api-access-j8z77\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.831665 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-kube-api-access-xdkml" (OuterVolumeSpecName: "kube-api-access-xdkml") pod "b6450f12-4337-43e5-b4e8-817e6b9a8d8f" (UID: "b6450f12-4337-43e5-b4e8-817e6b9a8d8f"). InnerVolumeSpecName "kube-api-access-xdkml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.880399 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6f59b89f4f-pqnkx"] Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.880434 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6f59b89f4f-pqnkx"] Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.915387 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75c7bb9db9-q6pzq"] Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.928623 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdkml\" (UniqueName: \"kubernetes.io/projected/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-kube-api-access-xdkml\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.929106 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-config" (OuterVolumeSpecName: "config") pod "b6450f12-4337-43e5-b4e8-817e6b9a8d8f" (UID: "b6450f12-4337-43e5-b4e8-817e6b9a8d8f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.936785 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75c7bb9db9-q6pzq"] Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.942926 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6450f12-4337-43e5-b4e8-817e6b9a8d8f" (UID: "b6450f12-4337-43e5-b4e8-817e6b9a8d8f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.946313 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6450f12-4337-43e5-b4e8-817e6b9a8d8f" (UID: "b6450f12-4337-43e5-b4e8-817e6b9a8d8f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.952160 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6450f12-4337-43e5-b4e8-817e6b9a8d8f" (UID: "b6450f12-4337-43e5-b4e8-817e6b9a8d8f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.961597 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6450f12-4337-43e5-b4e8-817e6b9a8d8f" (UID: "b6450f12-4337-43e5-b4e8-817e6b9a8d8f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.974893 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-59f5cc869c-j8h6v"] Mar 20 17:34:52 crc kubenswrapper[4803]: I0320 17:34:52.983141 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-59f5cc869c-j8h6v"] Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.029824 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.029854 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.029864 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.029872 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.029881 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6450f12-4337-43e5-b4e8-817e6b9a8d8f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.076212 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-wgnp4"] Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.084639 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5f59b8f679-wgnp4"] Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.198293 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-75b65b9966-sh4pn"] Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.225772 4803 scope.go:117] "RemoveContainer" containerID="c4cda99442dba1443a977674c39c658ac332e0afa8b2c8f4ff2e1cd0ef849e93" Mar 20 17:34:53 crc kubenswrapper[4803]: W0320 17:34:53.263260 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa5f0f8_cc68_400d_8570_df60f8d8c7ed.slice/crio-71b1be04cf2cb2930232b5e2dab15265d7f68dc9d126f6320eb3853e9ba1b642 WatchSource:0}: Error finding container 71b1be04cf2cb2930232b5e2dab15265d7f68dc9d126f6320eb3853e9ba1b642: Status 404 returned error can't find the container with id 71b1be04cf2cb2930232b5e2dab15265d7f68dc9d126f6320eb3853e9ba1b642 Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.292380 4803 scope.go:117] "RemoveContainer" containerID="be841f7c7e70a6df27fe2bf4b9fc8959399675ee60188ea94463b63879318d5b" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.321812 4803 scope.go:117] "RemoveContainer" containerID="ba659fc0cdd28b23c62059f98ea06997fdfba92de1c809f239614a6726b8efd7" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.346577 4803 scope.go:117] "RemoveContainer" containerID="86006e5fe143a864b511d8aa7f54dd3bd571cffd52237d95957d27f821034591" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.576155 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-596cfc5b56-w5pbk"] Mar 20 17:34:53 crc kubenswrapper[4803]: W0320 17:34:53.579283 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76a924c8_a380_4ee2_a6ce_ac77f0979f24.slice/crio-e15e8c061d51fe9ac5f12c51967c4bbce709f785f4e606cedcf70e8959615f9a WatchSource:0}: Error finding container 
e15e8c061d51fe9ac5f12c51967c4bbce709f785f4e606cedcf70e8959615f9a: Status 404 returned error can't find the container with id e15e8c061d51fe9ac5f12c51967c4bbce709f785f4e606cedcf70e8959615f9a Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.689514 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xfvq7"] Mar 20 17:34:53 crc kubenswrapper[4803]: W0320 17:34:53.698664 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8d182f2_4d18_46d2_b9a2_349bf6ddb311.slice/crio-9c869c5a74774d40ef3c3c707ee0e15c1098e7ba37c84b9d414ad238e63301f8 WatchSource:0}: Error finding container 9c869c5a74774d40ef3c3c707ee0e15c1098e7ba37c84b9d414ad238e63301f8: Status 404 returned error can't find the container with id 9c869c5a74774d40ef3c3c707ee0e15c1098e7ba37c84b9d414ad238e63301f8 Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.802671 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xfvq7" event={"ID":"a8d182f2-4d18-46d2-b9a2-349bf6ddb311","Type":"ContainerStarted","Data":"9c869c5a74774d40ef3c3c707ee0e15c1098e7ba37c84b9d414ad238e63301f8"} Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.810774 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b65b9966-sh4pn" event={"ID":"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed","Type":"ContainerStarted","Data":"71b1be04cf2cb2930232b5e2dab15265d7f68dc9d126f6320eb3853e9ba1b642"} Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.817810 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerStarted","Data":"ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb"} Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.819473 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cfc5b56-w5pbk" 
event={"ID":"76a924c8-a380-4ee2-a6ce-ac77f0979f24","Type":"ContainerStarted","Data":"e15e8c061d51fe9ac5f12c51967c4bbce709f785f4e606cedcf70e8959615f9a"} Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.837080 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-75rtg" event={"ID":"e6d002f5-379f-4709-a3df-aeb253a8884b","Type":"ContainerStarted","Data":"fe13e8955f038320902fb18b2c56d5aa4a644899a2f52e28c950ffdc85579fe7"} Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.886461 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-mh5hz"] Mar 20 17:34:53 crc kubenswrapper[4803]: E0320 17:34:53.886873 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.886886 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" Mar 20 17:34:53 crc kubenswrapper[4803]: E0320 17:34:53.886904 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ff4df91-5788-4dc9-a817-6c6a41bb955c" containerName="neutron-db-sync" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.886910 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ff4df91-5788-4dc9-a817-6c6a41bb955c" containerName="neutron-db-sync" Mar 20 17:34:53 crc kubenswrapper[4803]: E0320 17:34:53.886929 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="init" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.886934 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="init" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.887101 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ff4df91-5788-4dc9-a817-6c6a41bb955c" containerName="neutron-db-sync" Mar 20 
17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.887119 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" containerName="dnsmasq-dns" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.887965 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.903680 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-mh5hz"] Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.911535 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-75rtg" podStartSLOduration=4.6789016409999995 podStartE2EDuration="31.911496907s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="2026-03-20 17:34:24.669671844 +0000 UTC m=+1074.581263914" lastFinishedPulling="2026-03-20 17:34:51.90226707 +0000 UTC m=+1101.813859180" observedRunningTime="2026-03-20 17:34:53.873186331 +0000 UTC m=+1103.784778401" watchObservedRunningTime="2026-03-20 17:34:53.911496907 +0000 UTC m=+1103.823088977" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.951485 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.951618 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-config\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:53 crc 
kubenswrapper[4803]: I0320 17:34:53.951700 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.951805 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.951840 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-svc\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:53 crc kubenswrapper[4803]: I0320 17:34:53.951889 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7b88\" (UniqueName: \"kubernetes.io/projected/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-kube-api-access-r7b88\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.047581 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.055793 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.057261 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-config\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.057208 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.058162 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-config\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.058180 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.058223 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.058371 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.058409 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-svc\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.058475 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7b88\" (UniqueName: \"kubernetes.io/projected/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-kube-api-access-r7b88\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.058967 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.059441 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-svc\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: 
\"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.077250 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7b88\" (UniqueName: \"kubernetes.io/projected/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-kube-api-access-r7b88\") pod \"dnsmasq-dns-6b7b667979-mh5hz\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.091928 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-799cb79944-p24nt"] Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.093233 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.098840 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.099104 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.099211 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.099885 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mf4z4" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.108243 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-799cb79944-p24nt"] Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.160583 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-combined-ca-bundle\") pod \"neutron-799cb79944-p24nt\" (UID: 
\"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.160629 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-config\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.160687 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-ovndb-tls-certs\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.160740 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-httpd-config\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.160760 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx7f7\" (UniqueName: \"kubernetes.io/projected/ce4ab454-b5e3-435e-b406-ee5891a82b69-kube-api-access-jx7f7\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.270710 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-combined-ca-bundle\") pod \"neutron-799cb79944-p24nt\" (UID: 
\"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.270774 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-config\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.270910 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-ovndb-tls-certs\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.271024 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-httpd-config\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.271039 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx7f7\" (UniqueName: \"kubernetes.io/projected/ce4ab454-b5e3-435e-b406-ee5891a82b69-kube-api-access-jx7f7\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.275270 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-combined-ca-bundle\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc 
kubenswrapper[4803]: I0320 17:34:54.275299 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-ovndb-tls-certs\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.276261 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-httpd-config\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.286228 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-config\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.294118 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx7f7\" (UniqueName: \"kubernetes.io/projected/ce4ab454-b5e3-435e-b406-ee5891a82b69-kube-api-access-jx7f7\") pod \"neutron-799cb79944-p24nt\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.325855 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.542972 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.879818 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5567f335-1a97-4f2d-9cf0-da1e1fcab1c5" path="/var/lib/kubelet/pods/5567f335-1a97-4f2d-9cf0-da1e1fcab1c5/volumes" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.880512 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6450f12-4337-43e5-b4e8-817e6b9a8d8f" path="/var/lib/kubelet/pods/b6450f12-4337-43e5-b4e8-817e6b9a8d8f/volumes" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.881293 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d4a2a3-e213-45a5-b167-bbf8217eeca6" path="/var/lib/kubelet/pods/e6d4a2a3-e213-45a5-b167-bbf8217eeca6/volumes" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.881776 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff3e2f8c-2803-4b30-8921-9f8815fe8211" path="/var/lib/kubelet/pods/ff3e2f8c-2803-4b30-8921-9f8815fe8211/volumes" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.882700 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b65b9966-sh4pn" event={"ID":"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed","Type":"ContainerStarted","Data":"5f8eee39cb1e2bae324df0b81bafc95c7eb474ef6d9cd9db257fe0d718615d13"} Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.882740 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b65b9966-sh4pn" event={"ID":"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed","Type":"ContainerStarted","Data":"118e9705aaa8a134f7b51ee635781e210861eb775bae53298f2aeeb57f80314c"} Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.895099 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cfc5b56-w5pbk" event={"ID":"76a924c8-a380-4ee2-a6ce-ac77f0979f24","Type":"ContainerStarted","Data":"695df10fa05660ce3bf8a733b9ba5192f9c39372fe83a4abffcfb4499f3842c6"} 
Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.895140 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cfc5b56-w5pbk" event={"ID":"76a924c8-a380-4ee2-a6ce-ac77f0979f24","Type":"ContainerStarted","Data":"52edb3e6115e056d4c4444a644f1646250ef5061ed2e5cdb508dbcb9b8f55aa2"} Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.908089 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"824aff30-6d5e-4489-bf27-79910aafe31e","Type":"ContainerStarted","Data":"f0240cdb4bc1cd10ecc35ba3f331154b3247f37c1243547d2fa84fce4534a75a"} Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.908126 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"824aff30-6d5e-4489-bf27-79910aafe31e","Type":"ContainerStarted","Data":"cb5be6fb0d30073e29afd2569eac9ec86bda75041effcb358329f79dbd107482"} Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.910960 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.923468 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-75b65b9966-sh4pn" podStartSLOduration=23.387879235 podStartE2EDuration="23.923449634s" podCreationTimestamp="2026-03-20 17:34:31 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.29499893 +0000 UTC m=+1103.206591000" lastFinishedPulling="2026-03-20 17:34:53.830569329 +0000 UTC m=+1103.742161399" observedRunningTime="2026-03-20 17:34:54.905666223 +0000 UTC m=+1104.817258313" watchObservedRunningTime="2026-03-20 17:34:54.923449634 +0000 UTC m=+1104.835041704" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.942972 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-mh5hz"] Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.951646 4803 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-596cfc5b56-w5pbk" podStartSLOduration=23.277745117 podStartE2EDuration="23.951625475s" podCreationTimestamp="2026-03-20 17:34:31 +0000 UTC" firstStartedPulling="2026-03-20 17:34:53.582678397 +0000 UTC m=+1103.494270477" lastFinishedPulling="2026-03-20 17:34:54.256558765 +0000 UTC m=+1104.168150835" observedRunningTime="2026-03-20 17:34:54.951374829 +0000 UTC m=+1104.862966899" watchObservedRunningTime="2026-03-20 17:34:54.951625475 +0000 UTC m=+1104.863217545" Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.968769 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xfvq7" event={"ID":"a8d182f2-4d18-46d2-b9a2-349bf6ddb311","Type":"ContainerStarted","Data":"ca6c4f87fdaed092282729f2a14553e6dc9d18b1431d15a978ef85cf1dbfdce4"} Mar 20 17:34:54 crc kubenswrapper[4803]: I0320 17:34:54.989891 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xfvq7" podStartSLOduration=5.989869859 podStartE2EDuration="5.989869859s" podCreationTimestamp="2026-03-20 17:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:54.986305913 +0000 UTC m=+1104.897897983" watchObservedRunningTime="2026-03-20 17:34:54.989869859 +0000 UTC m=+1104.901461949" Mar 20 17:34:55 crc kubenswrapper[4803]: W0320 17:34:55.001878 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded2a5efb_ed66_4e83_96e9_12ae1bf75905.slice/crio-7b5232fa19e4da138a626ca0344be4d0ddc2f28144ddb103c25350a02af16e63 WatchSource:0}: Error finding container 7b5232fa19e4da138a626ca0344be4d0ddc2f28144ddb103c25350a02af16e63: Status 404 returned error can't find the container with id 7b5232fa19e4da138a626ca0344be4d0ddc2f28144ddb103c25350a02af16e63 Mar 20 17:34:55 crc kubenswrapper[4803]: I0320 
17:34:55.232420 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-799cb79944-p24nt"] Mar 20 17:34:55 crc kubenswrapper[4803]: W0320 17:34:55.348500 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce4ab454_b5e3_435e_b406_ee5891a82b69.slice/crio-ed79709addf6e0fc81955e356bba5d9b39283203eec242bee81945b7efc0f246 WatchSource:0}: Error finding container ed79709addf6e0fc81955e356bba5d9b39283203eec242bee81945b7efc0f246: Status 404 returned error can't find the container with id ed79709addf6e0fc81955e356bba5d9b39283203eec242bee81945b7efc0f246 Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.019992 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-799cb79944-p24nt" event={"ID":"ce4ab454-b5e3-435e-b406-ee5891a82b69","Type":"ContainerStarted","Data":"742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.020325 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-799cb79944-p24nt" event={"ID":"ce4ab454-b5e3-435e-b406-ee5891a82b69","Type":"ContainerStarted","Data":"3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.020348 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.020362 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-799cb79944-p24nt" event={"ID":"ce4ab454-b5e3-435e-b406-ee5891a82b69","Type":"ContainerStarted","Data":"ed79709addf6e0fc81955e356bba5d9b39283203eec242bee81945b7efc0f246"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.023911 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c76885c69-tg6gl"] Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.025399 4803 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.028483 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.028688 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.034771 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c76885c69-tg6gl"] Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.048936 4803 generic.go:334] "Generic (PLEG): container finished" podID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerID="c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01" exitCode=0 Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.049014 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" event={"ID":"ed2a5efb-ed66-4e83-96e9-12ae1bf75905","Type":"ContainerDied","Data":"c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.049038 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" event={"ID":"ed2a5efb-ed66-4e83-96e9-12ae1bf75905","Type":"ContainerStarted","Data":"7b5232fa19e4da138a626ca0344be4d0ddc2f28144ddb103c25350a02af16e63"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.059026 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"824aff30-6d5e-4489-bf27-79910aafe31e","Type":"ContainerStarted","Data":"fbee115a69c076117905f8a61c000191d7b543588ff6c357136ee8e193540c80"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.061403 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f22bff9a-80b3-41a7-9f0d-337b7c61b46a","Type":"ContainerStarted","Data":"df685bb06f31c910c8919b80d9b49a695a69dca779ad626488ca56799fd989e1"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.061450 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f22bff9a-80b3-41a7-9f0d-337b7c61b46a","Type":"ContainerStarted","Data":"93e99510fbb0104bf49c97406e8d9871b96dea2647d12a473e268f7f79abc7e3"} Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.082413 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-799cb79944-p24nt" podStartSLOduration=2.082395913 podStartE2EDuration="2.082395913s" podCreationTimestamp="2026-03-20 17:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:56.049943626 +0000 UTC m=+1105.961535696" watchObservedRunningTime="2026-03-20 17:34:56.082395913 +0000 UTC m=+1105.993987983" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.118790 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-internal-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.118890 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlqx9\" (UniqueName: \"kubernetes.io/projected/6570dd6e-761d-4d3d-99e8-5dfaba169520-kube-api-access-jlqx9\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.118938 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-combined-ca-bundle\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.118984 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-public-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.119006 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-httpd-config\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.119375 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-ovndb-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.119419 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-config\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.145483 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=13.145461338 podStartE2EDuration="13.145461338s" podCreationTimestamp="2026-03-20 17:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:56.140873754 +0000 UTC m=+1106.052465824" watchObservedRunningTime="2026-03-20 17:34:56.145461338 +0000 UTC m=+1106.057053398" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.229305 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-ovndb-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.229375 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-config\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.229467 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-internal-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.229531 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlqx9\" (UniqueName: \"kubernetes.io/projected/6570dd6e-761d-4d3d-99e8-5dfaba169520-kube-api-access-jlqx9\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc 
kubenswrapper[4803]: I0320 17:34:56.237390 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-config\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.239204 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-ovndb-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.239629 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-combined-ca-bundle\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.239751 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-public-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.239807 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-httpd-config\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.261448 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-internal-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.268034 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-combined-ca-bundle\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.268083 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-httpd-config\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.268690 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-public-tls-certs\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.275081 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlqx9\" (UniqueName: \"kubernetes.io/projected/6570dd6e-761d-4d3d-99e8-5dfaba169520-kube-api-access-jlqx9\") pod \"neutron-7c76885c69-tg6gl\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:56 crc kubenswrapper[4803]: I0320 17:34:56.357911 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:34:57 crc kubenswrapper[4803]: I0320 17:34:57.077335 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" event={"ID":"ed2a5efb-ed66-4e83-96e9-12ae1bf75905","Type":"ContainerStarted","Data":"1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6"} Mar 20 17:34:57 crc kubenswrapper[4803]: I0320 17:34:57.078347 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:34:57 crc kubenswrapper[4803]: I0320 17:34:57.099001 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" podStartSLOduration=4.098986225 podStartE2EDuration="4.098986225s" podCreationTimestamp="2026-03-20 17:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:57.098879732 +0000 UTC m=+1107.010471822" watchObservedRunningTime="2026-03-20 17:34:57.098986225 +0000 UTC m=+1107.010578285" Mar 20 17:34:57 crc kubenswrapper[4803]: I0320 17:34:57.131552 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c76885c69-tg6gl"] Mar 20 17:34:57 crc kubenswrapper[4803]: W0320 17:34:57.134447 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6570dd6e_761d_4d3d_99e8_5dfaba169520.slice/crio-de3c35d0ca5dc5f020a51eb767b69c50a6905108c2766cc71b3e81988994154f WatchSource:0}: Error finding container de3c35d0ca5dc5f020a51eb767b69c50a6905108c2766cc71b3e81988994154f: Status 404 returned error can't find the container with id de3c35d0ca5dc5f020a51eb767b69c50a6905108c2766cc71b3e81988994154f Mar 20 17:34:58 crc kubenswrapper[4803]: E0320 17:34:58.040176 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial 
failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6d002f5_379f_4709_a3df_aeb253a8884b.slice/crio-fe13e8955f038320902fb18b2c56d5aa4a644899a2f52e28c950ffdc85579fe7.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:34:58 crc kubenswrapper[4803]: I0320 17:34:58.098319 4803 generic.go:334] "Generic (PLEG): container finished" podID="e6d002f5-379f-4709-a3df-aeb253a8884b" containerID="fe13e8955f038320902fb18b2c56d5aa4a644899a2f52e28c950ffdc85579fe7" exitCode=0 Mar 20 17:34:58 crc kubenswrapper[4803]: I0320 17:34:58.099495 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-75rtg" event={"ID":"e6d002f5-379f-4709-a3df-aeb253a8884b","Type":"ContainerDied","Data":"fe13e8955f038320902fb18b2c56d5aa4a644899a2f52e28c950ffdc85579fe7"} Mar 20 17:34:58 crc kubenswrapper[4803]: I0320 17:34:58.110516 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c76885c69-tg6gl" event={"ID":"6570dd6e-761d-4d3d-99e8-5dfaba169520","Type":"ContainerStarted","Data":"0ded043744782977e6c1dc9d31e30bbcb0eeaec113bae6bf8f1a46cd831b848a"} Mar 20 17:34:58 crc kubenswrapper[4803]: I0320 17:34:58.110749 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c76885c69-tg6gl" event={"ID":"6570dd6e-761d-4d3d-99e8-5dfaba169520","Type":"ContainerStarted","Data":"de3c35d0ca5dc5f020a51eb767b69c50a6905108c2766cc71b3e81988994154f"} Mar 20 17:34:58 crc kubenswrapper[4803]: I0320 17:34:58.124629 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f22bff9a-80b3-41a7-9f0d-337b7c61b46a","Type":"ContainerStarted","Data":"c3e79caec9548dc0bbc047750096a62108fde9eb28fbc970e8a22062d4aad3b6"} Mar 20 17:34:58 crc kubenswrapper[4803]: I0320 17:34:58.152981 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.152962978 
podStartE2EDuration="15.152962978s" podCreationTimestamp="2026-03-20 17:34:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:34:58.146789871 +0000 UTC m=+1108.058381951" watchObservedRunningTime="2026-03-20 17:34:58.152962978 +0000 UTC m=+1108.064555048" Mar 20 17:35:01 crc kubenswrapper[4803]: I0320 17:35:01.662734 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:35:01 crc kubenswrapper[4803]: I0320 17:35:01.663265 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:35:01 crc kubenswrapper[4803]: I0320 17:35:01.724508 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:35:01 crc kubenswrapper[4803]: I0320 17:35:01.724647 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.525887 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-75rtg" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.594426 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c94gp\" (UniqueName: \"kubernetes.io/projected/e6d002f5-379f-4709-a3df-aeb253a8884b-kube-api-access-c94gp\") pod \"e6d002f5-379f-4709-a3df-aeb253a8884b\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.594498 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-combined-ca-bundle\") pod \"e6d002f5-379f-4709-a3df-aeb253a8884b\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.594583 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d002f5-379f-4709-a3df-aeb253a8884b-logs\") pod \"e6d002f5-379f-4709-a3df-aeb253a8884b\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.594623 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-config-data\") pod \"e6d002f5-379f-4709-a3df-aeb253a8884b\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.594701 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-scripts\") pod \"e6d002f5-379f-4709-a3df-aeb253a8884b\" (UID: \"e6d002f5-379f-4709-a3df-aeb253a8884b\") " Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.596342 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e6d002f5-379f-4709-a3df-aeb253a8884b-logs" (OuterVolumeSpecName: "logs") pod "e6d002f5-379f-4709-a3df-aeb253a8884b" (UID: "e6d002f5-379f-4709-a3df-aeb253a8884b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.615556 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-scripts" (OuterVolumeSpecName: "scripts") pod "e6d002f5-379f-4709-a3df-aeb253a8884b" (UID: "e6d002f5-379f-4709-a3df-aeb253a8884b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.617899 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d002f5-379f-4709-a3df-aeb253a8884b-kube-api-access-c94gp" (OuterVolumeSpecName: "kube-api-access-c94gp") pod "e6d002f5-379f-4709-a3df-aeb253a8884b" (UID: "e6d002f5-379f-4709-a3df-aeb253a8884b"). InnerVolumeSpecName "kube-api-access-c94gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.636419 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-config-data" (OuterVolumeSpecName: "config-data") pod "e6d002f5-379f-4709-a3df-aeb253a8884b" (UID: "e6d002f5-379f-4709-a3df-aeb253a8884b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.641096 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6d002f5-379f-4709-a3df-aeb253a8884b" (UID: "e6d002f5-379f-4709-a3df-aeb253a8884b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.696766 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.696791 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e6d002f5-379f-4709-a3df-aeb253a8884b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.696801 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.696811 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6d002f5-379f-4709-a3df-aeb253a8884b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:03 crc kubenswrapper[4803]: I0320 17:35:03.696820 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c94gp\" (UniqueName: \"kubernetes.io/projected/e6d002f5-379f-4709-a3df-aeb253a8884b-kube-api-access-c94gp\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.054855 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.054893 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.070131 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.070164 4803 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.088331 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.097350 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.120217 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.139310 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.186041 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-75rtg" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.187132 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-75rtg" event={"ID":"e6d002f5-379f-4709-a3df-aeb253a8884b","Type":"ContainerDied","Data":"408aa3be400ceeca85699119448e316dabeb5a85fb997e96d16381cbc6033444"} Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.187178 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="408aa3be400ceeca85699119448e316dabeb5a85fb997e96d16381cbc6033444" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.188794 4803 generic.go:334] "Generic (PLEG): container finished" podID="a8d182f2-4d18-46d2-b9a2-349bf6ddb311" containerID="ca6c4f87fdaed092282729f2a14553e6dc9d18b1431d15a978ef85cf1dbfdce4" exitCode=0 Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.190012 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xfvq7" 
event={"ID":"a8d182f2-4d18-46d2-b9a2-349bf6ddb311","Type":"ContainerDied","Data":"ca6c4f87fdaed092282729f2a14553e6dc9d18b1431d15a978ef85cf1dbfdce4"} Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.190048 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.192191 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.192219 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.192232 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.328689 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.394414 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7dldg"] Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.395113 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" podUID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerName="dnsmasq-dns" containerID="cri-o://ad44d3471a9d026f800a7817071234aecf7531430d1475413ebefb3994ffda17" gracePeriod=10 Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.696651 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68bd594944-fvgn8"] Mar 20 17:35:04 crc kubenswrapper[4803]: E0320 17:35:04.697276 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d002f5-379f-4709-a3df-aeb253a8884b" containerName="placement-db-sync" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.697290 4803 
state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d002f5-379f-4709-a3df-aeb253a8884b" containerName="placement-db-sync" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.697629 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d002f5-379f-4709-a3df-aeb253a8884b" containerName="placement-db-sync" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.699169 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.704083 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-pxj2c" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.704628 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.704786 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.705410 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.705654 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.741164 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68bd594944-fvgn8"] Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.822943 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-scripts\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.823038 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-combined-ca-bundle\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.823083 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zblsj\" (UniqueName: \"kubernetes.io/projected/c7e954db-f75f-4ab2-93af-c3f0738748ac-kube-api-access-zblsj\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.823160 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-internal-tls-certs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.823235 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-public-tls-certs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.823322 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-config-data\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc 
kubenswrapper[4803]: I0320 17:35:04.823361 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e954db-f75f-4ab2-93af-c3f0738748ac-logs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.924982 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-scripts\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.925033 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-combined-ca-bundle\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.925052 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zblsj\" (UniqueName: \"kubernetes.io/projected/c7e954db-f75f-4ab2-93af-c3f0738748ac-kube-api-access-zblsj\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.925108 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-internal-tls-certs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.925160 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-public-tls-certs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.925201 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-config-data\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.925236 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e954db-f75f-4ab2-93af-c3f0738748ac-logs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.925637 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e954db-f75f-4ab2-93af-c3f0738748ac-logs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.930442 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-scripts\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.930746 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-internal-tls-certs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.931083 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-config-data\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.933449 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-public-tls-certs\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.939274 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-combined-ca-bundle\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:04 crc kubenswrapper[4803]: I0320 17:35:04.966511 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zblsj\" (UniqueName: \"kubernetes.io/projected/c7e954db-f75f-4ab2-93af-c3f0738748ac-kube-api-access-zblsj\") pod \"placement-68bd594944-fvgn8\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:05 crc kubenswrapper[4803]: I0320 17:35:05.042323 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:05 crc kubenswrapper[4803]: I0320 17:35:05.197408 4803 generic.go:334] "Generic (PLEG): container finished" podID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerID="ad44d3471a9d026f800a7817071234aecf7531430d1475413ebefb3994ffda17" exitCode=0 Mar 20 17:35:05 crc kubenswrapper[4803]: I0320 17:35:05.197505 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" event={"ID":"e163614d-669d-44cf-93bd-6e6107dcf86e","Type":"ContainerDied","Data":"ad44d3471a9d026f800a7817071234aecf7531430d1475413ebefb3994ffda17"} Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.206897 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.206920 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.207673 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.207685 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.546686 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.567761 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.654051 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-scripts\") pod \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.654130 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-combined-ca-bundle\") pod \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.654175 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-fernet-keys\") pod \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.654190 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-credential-keys\") pod \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.654223 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcbp2\" (UniqueName: \"kubernetes.io/projected/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-kube-api-access-dcbp2\") pod \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.654364 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-config-data\") pod \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\" (UID: \"a8d182f2-4d18-46d2-b9a2-349bf6ddb311\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.660157 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-kube-api-access-dcbp2" (OuterVolumeSpecName: "kube-api-access-dcbp2") pod "a8d182f2-4d18-46d2-b9a2-349bf6ddb311" (UID: "a8d182f2-4d18-46d2-b9a2-349bf6ddb311"). InnerVolumeSpecName "kube-api-access-dcbp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.666981 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-scripts" (OuterVolumeSpecName: "scripts") pod "a8d182f2-4d18-46d2-b9a2-349bf6ddb311" (UID: "a8d182f2-4d18-46d2-b9a2-349bf6ddb311"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.670998 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a8d182f2-4d18-46d2-b9a2-349bf6ddb311" (UID: "a8d182f2-4d18-46d2-b9a2-349bf6ddb311"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.672579 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a8d182f2-4d18-46d2-b9a2-349bf6ddb311" (UID: "a8d182f2-4d18-46d2-b9a2-349bf6ddb311"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.695930 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-config-data" (OuterVolumeSpecName: "config-data") pod "a8d182f2-4d18-46d2-b9a2-349bf6ddb311" (UID: "a8d182f2-4d18-46d2-b9a2-349bf6ddb311"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.735665 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8d182f2-4d18-46d2-b9a2-349bf6ddb311" (UID: "a8d182f2-4d18-46d2-b9a2-349bf6ddb311"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.757253 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.757285 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.757295 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.757304 4803 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:06 crc 
kubenswrapper[4803]: I0320 17:35:06.757312 4803 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.757321 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcbp2\" (UniqueName: \"kubernetes.io/projected/a8d182f2-4d18-46d2-b9a2-349bf6ddb311-kube-api-access-dcbp2\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.783301 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.858933 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-nb\") pod \"e163614d-669d-44cf-93bd-6e6107dcf86e\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.858998 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-swift-storage-0\") pod \"e163614d-669d-44cf-93bd-6e6107dcf86e\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.859025 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-config\") pod \"e163614d-669d-44cf-93bd-6e6107dcf86e\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.859162 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-svc\") pod \"e163614d-669d-44cf-93bd-6e6107dcf86e\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.859258 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69wjk\" (UniqueName: \"kubernetes.io/projected/e163614d-669d-44cf-93bd-6e6107dcf86e-kube-api-access-69wjk\") pod \"e163614d-669d-44cf-93bd-6e6107dcf86e\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.859284 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-sb\") pod \"e163614d-669d-44cf-93bd-6e6107dcf86e\" (UID: \"e163614d-669d-44cf-93bd-6e6107dcf86e\") " Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.868692 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e163614d-669d-44cf-93bd-6e6107dcf86e-kube-api-access-69wjk" (OuterVolumeSpecName: "kube-api-access-69wjk") pod "e163614d-669d-44cf-93bd-6e6107dcf86e" (UID: "e163614d-669d-44cf-93bd-6e6107dcf86e"). InnerVolumeSpecName "kube-api-access-69wjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.895625 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:06 crc kubenswrapper[4803]: I0320 17:35:06.961548 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69wjk\" (UniqueName: \"kubernetes.io/projected/e163614d-669d-44cf-93bd-6e6107dcf86e-kube-api-access-69wjk\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.039448 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e163614d-669d-44cf-93bd-6e6107dcf86e" (UID: "e163614d-669d-44cf-93bd-6e6107dcf86e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.069849 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.084115 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e163614d-669d-44cf-93bd-6e6107dcf86e" (UID: "e163614d-669d-44cf-93bd-6e6107dcf86e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.093923 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e163614d-669d-44cf-93bd-6e6107dcf86e" (UID: "e163614d-669d-44cf-93bd-6e6107dcf86e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.101166 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-config" (OuterVolumeSpecName: "config") pod "e163614d-669d-44cf-93bd-6e6107dcf86e" (UID: "e163614d-669d-44cf-93bd-6e6107dcf86e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.104013 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e163614d-669d-44cf-93bd-6e6107dcf86e" (UID: "e163614d-669d-44cf-93bd-6e6107dcf86e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.171748 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.171777 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.171786 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.171797 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e163614d-669d-44cf-93bd-6e6107dcf86e-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.239122 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68bd594944-fvgn8"] Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.243437 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" event={"ID":"e163614d-669d-44cf-93bd-6e6107dcf86e","Type":"ContainerDied","Data":"b9435c8155ba4d34e276c3846071d571227d59578e317f2b65f6b71403ad86f6"} Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.243484 4803 scope.go:117] "RemoveContainer" containerID="ad44d3471a9d026f800a7817071234aecf7531430d1475413ebefb3994ffda17" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.243627 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-7dldg" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.255331 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerStarted","Data":"48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d"} Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.270212 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c76885c69-tg6gl" event={"ID":"6570dd6e-761d-4d3d-99e8-5dfaba169520","Type":"ContainerStarted","Data":"96a6e4bc9c87ace754b9960b446e01b6d5898a11961cc4f8d2b48a5403ef2751"} Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.270361 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.281129 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xm2fq" event={"ID":"1f3de51a-19ff-4714-b839-921efeeb3e48","Type":"ContainerStarted","Data":"b94a88755bb30e8fd224eaa50df2b53d3a5138c0c490af81802bb0e41335d468"} Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.283458 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7dldg"] Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.290893 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-7dldg"] Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.295118 4803 scope.go:117] "RemoveContainer" containerID="1e928c9d8c104b8f2d47f898ce22e1603870f5b9fa5b61353dfe75654422527d" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.302305 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xfvq7" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.302858 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xfvq7" event={"ID":"a8d182f2-4d18-46d2-b9a2-349bf6ddb311","Type":"ContainerDied","Data":"9c869c5a74774d40ef3c3c707ee0e15c1098e7ba37c84b9d414ad238e63301f8"} Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.302887 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c869c5a74774d40ef3c3c707ee0e15c1098e7ba37c84b9d414ad238e63301f8" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.306900 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c76885c69-tg6gl" podStartSLOduration=12.306882622 podStartE2EDuration="12.306882622s" podCreationTimestamp="2026-03-20 17:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:07.296228684 +0000 UTC m=+1117.207820764" watchObservedRunningTime="2026-03-20 17:35:07.306882622 +0000 UTC m=+1117.218474692" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.315316 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-xm2fq" podStartSLOduration=2.4967971110000002 podStartE2EDuration="45.315303619s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="2026-03-20 17:34:23.975034346 +0000 UTC m=+1073.886626416" lastFinishedPulling="2026-03-20 17:35:06.793540854 +0000 UTC m=+1116.705132924" observedRunningTime="2026-03-20 17:35:07.314819566 +0000 UTC m=+1117.226411626" watchObservedRunningTime="2026-03-20 17:35:07.315303619 +0000 UTC m=+1117.226895689" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.417378 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:35:07 crc 
kubenswrapper[4803]: I0320 17:35:07.417509 4803 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.592153 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.711207 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-85847589bb-9pbbf"] Mar 20 17:35:07 crc kubenswrapper[4803]: E0320 17:35:07.711657 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerName="init" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.711674 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerName="init" Mar 20 17:35:07 crc kubenswrapper[4803]: E0320 17:35:07.711689 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d182f2-4d18-46d2-b9a2-349bf6ddb311" containerName="keystone-bootstrap" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.711695 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d182f2-4d18-46d2-b9a2-349bf6ddb311" containerName="keystone-bootstrap" Mar 20 17:35:07 crc kubenswrapper[4803]: E0320 17:35:07.711706 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerName="dnsmasq-dns" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.711713 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerName="dnsmasq-dns" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.711883 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e163614d-669d-44cf-93bd-6e6107dcf86e" containerName="dnsmasq-dns" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.711902 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d182f2-4d18-46d2-b9a2-349bf6ddb311" 
containerName="keystone-bootstrap" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.712404 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.719293 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.721715 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85847589bb-9pbbf"] Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.721805 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.723782 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-q8lcw" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.724873 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.725604 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.725740 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.786977 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-fernet-keys\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.787021 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-combined-ca-bundle\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.787091 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7bl5\" (UniqueName: \"kubernetes.io/projected/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-kube-api-access-c7bl5\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.787119 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-config-data\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.787138 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-scripts\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.787154 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-public-tls-certs\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.787186 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-credential-keys\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.787230 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-internal-tls-certs\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889116 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7bl5\" (UniqueName: \"kubernetes.io/projected/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-kube-api-access-c7bl5\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889162 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-config-data\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889199 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-scripts\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889222 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-public-tls-certs\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889281 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-credential-keys\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889323 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-internal-tls-certs\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889374 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-fernet-keys\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.889390 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-combined-ca-bundle\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.895584 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-scripts\") pod 
\"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.895760 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-internal-tls-certs\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.896098 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-combined-ca-bundle\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.897890 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-credential-keys\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.898338 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-fernet-keys\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.898429 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-public-tls-certs\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" 
Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.898676 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-config-data\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:07 crc kubenswrapper[4803]: I0320 17:35:07.906459 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7bl5\" (UniqueName: \"kubernetes.io/projected/42bf225e-cdca-4bc8-922d-8ff2bcb6ff17-kube-api-access-c7bl5\") pod \"keystone-85847589bb-9pbbf\" (UID: \"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17\") " pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.065269 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.205662 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5984b77b84-4dhqv"] Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.206996 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.230648 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5984b77b84-4dhqv"] Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.296849 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-config-data\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.296949 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ebc450-0e00-492b-a5ab-f02c63aa4071-logs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.297018 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-internal-tls-certs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.297052 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdt8v\" (UniqueName: \"kubernetes.io/projected/d3ebc450-0e00-492b-a5ab-f02c63aa4071-kube-api-access-xdt8v\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.297086 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-scripts\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.297110 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-combined-ca-bundle\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.297150 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-public-tls-certs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.317636 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dtdjf" event={"ID":"56c1156a-5e7a-4547-8a7b-46a55651b7a7","Type":"ContainerStarted","Data":"097c610e66be84a58a9675df68e757e52647d5dbc7c78f7a7f93c2338a1b9104"} Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.333441 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bd594944-fvgn8" event={"ID":"c7e954db-f75f-4ab2-93af-c3f0738748ac","Type":"ContainerStarted","Data":"af5b4baeba5181ab821cd4086bfa7f3a4696767e199b4932917faa1367122571"} Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.333489 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.333502 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bd594944-fvgn8" 
event={"ID":"c7e954db-f75f-4ab2-93af-c3f0738748ac","Type":"ContainerStarted","Data":"6ab2cf5f45350281da36b5c6137444d215cbfd248825c7960db825bb0ae4aaa9"} Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.333514 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bd594944-fvgn8" event={"ID":"c7e954db-f75f-4ab2-93af-c3f0738748ac","Type":"ContainerStarted","Data":"fb545e96b631a2d935922a41a38694bf2f88a00a41527992a85d2e3be6b73b3b"} Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.334138 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.349352 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dtdjf" podStartSLOduration=3.535603764 podStartE2EDuration="46.349337743s" podCreationTimestamp="2026-03-20 17:34:22 +0000 UTC" firstStartedPulling="2026-03-20 17:34:23.975346044 +0000 UTC m=+1073.886938114" lastFinishedPulling="2026-03-20 17:35:06.789080023 +0000 UTC m=+1116.700672093" observedRunningTime="2026-03-20 17:35:08.333839494 +0000 UTC m=+1118.245431564" watchObservedRunningTime="2026-03-20 17:35:08.349337743 +0000 UTC m=+1118.260929813" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.367134 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-68bd594944-fvgn8" podStartSLOduration=4.367114214 podStartE2EDuration="4.367114214s" podCreationTimestamp="2026-03-20 17:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:08.357101303 +0000 UTC m=+1118.268693373" watchObservedRunningTime="2026-03-20 17:35:08.367114214 +0000 UTC m=+1118.278706284" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.398428 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-public-tls-certs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.398550 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-config-data\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.398676 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ebc450-0e00-492b-a5ab-f02c63aa4071-logs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.398771 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-internal-tls-certs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.398801 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdt8v\" (UniqueName: \"kubernetes.io/projected/d3ebc450-0e00-492b-a5ab-f02c63aa4071-kube-api-access-xdt8v\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.398851 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-scripts\") pod 
\"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.398868 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-combined-ca-bundle\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.402224 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3ebc450-0e00-492b-a5ab-f02c63aa4071-logs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.416146 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-scripts\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.417293 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdt8v\" (UniqueName: \"kubernetes.io/projected/d3ebc450-0e00-492b-a5ab-f02c63aa4071-kube-api-access-xdt8v\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.417756 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-internal-tls-certs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " 
pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.418997 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-config-data\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.420051 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-combined-ca-bundle\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.433281 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ebc450-0e00-492b-a5ab-f02c63aa4071-public-tls-certs\") pod \"placement-5984b77b84-4dhqv\" (UID: \"d3ebc450-0e00-492b-a5ab-f02c63aa4071\") " pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.528687 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.639024 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-85847589bb-9pbbf"] Mar 20 17:35:08 crc kubenswrapper[4803]: W0320 17:35:08.645139 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42bf225e_cdca_4bc8_922d_8ff2bcb6ff17.slice/crio-c25949dc312726d38bfa8aa085a0af6e52303e38f987ecf69ae9d276e1e6f754 WatchSource:0}: Error finding container c25949dc312726d38bfa8aa085a0af6e52303e38f987ecf69ae9d276e1e6f754: Status 404 returned error can't find the container with id c25949dc312726d38bfa8aa085a0af6e52303e38f987ecf69ae9d276e1e6f754 Mar 20 17:35:08 crc kubenswrapper[4803]: I0320 17:35:08.860091 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e163614d-669d-44cf-93bd-6e6107dcf86e" path="/var/lib/kubelet/pods/e163614d-669d-44cf-93bd-6e6107dcf86e/volumes" Mar 20 17:35:09 crc kubenswrapper[4803]: I0320 17:35:09.121204 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5984b77b84-4dhqv"] Mar 20 17:35:09 crc kubenswrapper[4803]: W0320 17:35:09.125087 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ebc450_0e00_492b_a5ab_f02c63aa4071.slice/crio-5a3542d3c96ac78b2e4d4ab07793aed8cbe0767abfe91a81b73c3c5de9f32eeb WatchSource:0}: Error finding container 5a3542d3c96ac78b2e4d4ab07793aed8cbe0767abfe91a81b73c3c5de9f32eeb: Status 404 returned error can't find the container with id 5a3542d3c96ac78b2e4d4ab07793aed8cbe0767abfe91a81b73c3c5de9f32eeb Mar 20 17:35:09 crc kubenswrapper[4803]: I0320 17:35:09.344359 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5984b77b84-4dhqv" 
event={"ID":"d3ebc450-0e00-492b-a5ab-f02c63aa4071","Type":"ContainerStarted","Data":"8f1379e7b769507b865d05a55d58053f728ecadf4e15cd098939d2f138e9b4ce"} Mar 20 17:35:09 crc kubenswrapper[4803]: I0320 17:35:09.344767 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5984b77b84-4dhqv" event={"ID":"d3ebc450-0e00-492b-a5ab-f02c63aa4071","Type":"ContainerStarted","Data":"5a3542d3c96ac78b2e4d4ab07793aed8cbe0767abfe91a81b73c3c5de9f32eeb"} Mar 20 17:35:09 crc kubenswrapper[4803]: I0320 17:35:09.346378 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85847589bb-9pbbf" event={"ID":"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17","Type":"ContainerStarted","Data":"998c8565c4d4cd3bde606ffcd1c93005e186050b407fcc137bb622b9cf070263"} Mar 20 17:35:09 crc kubenswrapper[4803]: I0320 17:35:09.346426 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-85847589bb-9pbbf" event={"ID":"42bf225e-cdca-4bc8-922d-8ff2bcb6ff17","Type":"ContainerStarted","Data":"c25949dc312726d38bfa8aa085a0af6e52303e38f987ecf69ae9d276e1e6f754"} Mar 20 17:35:09 crc kubenswrapper[4803]: I0320 17:35:09.346767 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:09 crc kubenswrapper[4803]: I0320 17:35:09.376559 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-85847589bb-9pbbf" podStartSLOduration=2.376516022 podStartE2EDuration="2.376516022s" podCreationTimestamp="2026-03-20 17:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:09.367160769 +0000 UTC m=+1119.278752859" watchObservedRunningTime="2026-03-20 17:35:09.376516022 +0000 UTC m=+1119.288108102" Mar 20 17:35:10 crc kubenswrapper[4803]: I0320 17:35:10.356359 4803 generic.go:334] "Generic (PLEG): container finished" podID="1f3de51a-19ff-4714-b839-921efeeb3e48" 
containerID="b94a88755bb30e8fd224eaa50df2b53d3a5138c0c490af81802bb0e41335d468" exitCode=0 Mar 20 17:35:10 crc kubenswrapper[4803]: I0320 17:35:10.356561 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xm2fq" event={"ID":"1f3de51a-19ff-4714-b839-921efeeb3e48","Type":"ContainerDied","Data":"b94a88755bb30e8fd224eaa50df2b53d3a5138c0c490af81802bb0e41335d468"} Mar 20 17:35:10 crc kubenswrapper[4803]: I0320 17:35:10.359618 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5984b77b84-4dhqv" event={"ID":"d3ebc450-0e00-492b-a5ab-f02c63aa4071","Type":"ContainerStarted","Data":"de7f280ea1528ed2a731f5190dfa231a279615e7b7e0597371a1c44e2abb9f8e"} Mar 20 17:35:10 crc kubenswrapper[4803]: I0320 17:35:10.359646 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:10 crc kubenswrapper[4803]: I0320 17:35:10.360070 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:10 crc kubenswrapper[4803]: I0320 17:35:10.400190 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5984b77b84-4dhqv" podStartSLOduration=2.400166954 podStartE2EDuration="2.400166954s" podCreationTimestamp="2026-03-20 17:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:10.389007793 +0000 UTC m=+1120.300599883" watchObservedRunningTime="2026-03-20 17:35:10.400166954 +0000 UTC m=+1120.311759024" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.663703 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-75b65b9966-sh4pn" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: 
connection refused" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.697883 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.734017 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-596cfc5b56-w5pbk" podUID="76a924c8-a380-4ee2-a6ce-ac77f0979f24" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.775177 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-combined-ca-bundle\") pod \"1f3de51a-19ff-4714-b839-921efeeb3e48\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.775259 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-db-sync-config-data\") pod \"1f3de51a-19ff-4714-b839-921efeeb3e48\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.775324 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8bvt\" (UniqueName: \"kubernetes.io/projected/1f3de51a-19ff-4714-b839-921efeeb3e48-kube-api-access-v8bvt\") pod \"1f3de51a-19ff-4714-b839-921efeeb3e48\" (UID: \"1f3de51a-19ff-4714-b839-921efeeb3e48\") " Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.780901 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1f3de51a-19ff-4714-b839-921efeeb3e48" (UID: 
"1f3de51a-19ff-4714-b839-921efeeb3e48"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.781933 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f3de51a-19ff-4714-b839-921efeeb3e48-kube-api-access-v8bvt" (OuterVolumeSpecName: "kube-api-access-v8bvt") pod "1f3de51a-19ff-4714-b839-921efeeb3e48" (UID: "1f3de51a-19ff-4714-b839-921efeeb3e48"). InnerVolumeSpecName "kube-api-access-v8bvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.807497 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f3de51a-19ff-4714-b839-921efeeb3e48" (UID: "1f3de51a-19ff-4714-b839-921efeeb3e48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.877069 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8bvt\" (UniqueName: \"kubernetes.io/projected/1f3de51a-19ff-4714-b839-921efeeb3e48-kube-api-access-v8bvt\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.877098 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:11 crc kubenswrapper[4803]: I0320 17:35:11.877107 4803 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f3de51a-19ff-4714-b839-921efeeb3e48-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.419379 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-xm2fq" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.419665 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-xm2fq" event={"ID":"1f3de51a-19ff-4714-b839-921efeeb3e48","Type":"ContainerDied","Data":"66a5aa89991d90aa6401b0c301dcaf63ad66c3e099846be3479f5af663593e04"} Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.420227 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66a5aa89991d90aa6401b0c301dcaf63ad66c3e099846be3479f5af663593e04" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.667019 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6bfd9f5f6f-xmdsr"] Mar 20 17:35:12 crc kubenswrapper[4803]: E0320 17:35:12.667339 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f3de51a-19ff-4714-b839-921efeeb3e48" containerName="barbican-db-sync" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.667352 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3de51a-19ff-4714-b839-921efeeb3e48" containerName="barbican-db-sync" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.667562 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f3de51a-19ff-4714-b839-921efeeb3e48" containerName="barbican-db-sync" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.668392 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.671783 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.671946 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2fq6h" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.672061 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.703810 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bfd9f5f6f-xmdsr"] Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.706689 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-config-data\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.706738 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-config-data-custom\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.706801 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-combined-ca-bundle\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " 
pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.706901 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm5cx\" (UniqueName: \"kubernetes.io/projected/43090279-19af-4393-ab9a-1092aae61875-kube-api-access-dm5cx\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.706929 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43090279-19af-4393-ab9a-1092aae61875-logs\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.797057 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b99997f8b-mzkpj"] Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.798855 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.801316 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.808638 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-combined-ca-bundle\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.808737 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm5cx\" (UniqueName: \"kubernetes.io/projected/43090279-19af-4393-ab9a-1092aae61875-kube-api-access-dm5cx\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.808775 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43090279-19af-4393-ab9a-1092aae61875-logs\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.808831 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-config-data\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.808855 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-config-data-custom\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.813152 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43090279-19af-4393-ab9a-1092aae61875-logs\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.815878 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-config-data-custom\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.826699 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-4nrzk"] Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.828147 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.830246 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-config-data\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.841171 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43090279-19af-4393-ab9a-1092aae61875-combined-ca-bundle\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.872168 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm5cx\" (UniqueName: \"kubernetes.io/projected/43090279-19af-4393-ab9a-1092aae61875-kube-api-access-dm5cx\") pod \"barbican-worker-6bfd9f5f6f-xmdsr\" (UID: \"43090279-19af-4393-ab9a-1092aae61875\") " pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.876371 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b99997f8b-mzkpj"] Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.898593 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-4nrzk"] Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910187 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:12 
crc kubenswrapper[4803]: I0320 17:35:12.910295 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-config-data-custom\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910323 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-config\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910369 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-combined-ca-bundle\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910399 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-config-data\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910432 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-nb\") pod 
\"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910465 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910497 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910564 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2hw\" (UniqueName: \"kubernetes.io/projected/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-kube-api-access-8n2hw\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910619 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726567e8-cdfd-4fe4-985f-1cf10a787994-logs\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.910666 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4snnj\" (UniqueName: 
\"kubernetes.io/projected/726567e8-cdfd-4fe4-985f-1cf10a787994-kube-api-access-4snnj\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.933941 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-68dc9dd7c8-7lvg9"] Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.936806 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.940011 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 17:35:12 crc kubenswrapper[4803]: I0320 17:35:12.957646 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68dc9dd7c8-7lvg9"] Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.003969 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017333 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-config-data\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017376 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017398 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data-custom\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017413 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-combined-ca-bundle\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017437 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-swift-storage-0\") pod 
\"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017462 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017484 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017530 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8njt\" (UniqueName: \"kubernetes.io/projected/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-kube-api-access-x8njt\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017548 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2hw\" (UniqueName: \"kubernetes.io/projected/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-kube-api-access-8n2hw\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017589 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726567e8-cdfd-4fe4-985f-1cf10a787994-logs\") pod 
\"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017605 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-logs\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017632 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4snnj\" (UniqueName: \"kubernetes.io/projected/726567e8-cdfd-4fe4-985f-1cf10a787994-kube-api-access-4snnj\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017649 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017677 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-config-data-custom\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017698 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-config\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.017729 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-combined-ca-bundle\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.023240 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/726567e8-cdfd-4fe4-985f-1cf10a787994-logs\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.023911 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.024456 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.024949 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.025440 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.028102 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-config\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.028921 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-combined-ca-bundle\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.031146 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-config-data-custom\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.032085 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/726567e8-cdfd-4fe4-985f-1cf10a787994-config-data\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.056026 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4snnj\" (UniqueName: \"kubernetes.io/projected/726567e8-cdfd-4fe4-985f-1cf10a787994-kube-api-access-4snnj\") pod \"barbican-keystone-listener-5b99997f8b-mzkpj\" (UID: \"726567e8-cdfd-4fe4-985f-1cf10a787994\") " pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.060759 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2hw\" (UniqueName: \"kubernetes.io/projected/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-kube-api-access-8n2hw\") pod \"dnsmasq-dns-848cf88cfc-4nrzk\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.124795 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data-custom\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.124847 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-combined-ca-bundle\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.124922 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.124964 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8njt\" (UniqueName: \"kubernetes.io/projected/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-kube-api-access-x8njt\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.125032 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-logs\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.127152 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-logs\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.129750 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data-custom\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.131100 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-combined-ca-bundle\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.134438 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.149073 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8njt\" (UniqueName: \"kubernetes.io/projected/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-kube-api-access-x8njt\") pod \"barbican-api-68dc9dd7c8-7lvg9\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.289535 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.299543 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.306622 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.429979 4803 generic.go:334] "Generic (PLEG): container finished" podID="56c1156a-5e7a-4547-8a7b-46a55651b7a7" containerID="097c610e66be84a58a9675df68e757e52647d5dbc7c78f7a7f93c2338a1b9104" exitCode=0 Mar 20 17:35:13 crc kubenswrapper[4803]: I0320 17:35:13.430030 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dtdjf" event={"ID":"56c1156a-5e7a-4547-8a7b-46a55651b7a7","Type":"ContainerDied","Data":"097c610e66be84a58a9675df68e757e52647d5dbc7c78f7a7f93c2338a1b9104"} Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.403234 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-648ff8c756-h8wjt"] Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.405123 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.409157 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.411000 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.421938 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-648ff8c756-h8wjt"] Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.474773 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-internal-tls-certs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.474823 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcmn7\" (UniqueName: \"kubernetes.io/projected/deeb9bec-8bcb-48ef-ba67-b5772825f753-kube-api-access-hcmn7\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.474860 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-public-tls-certs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.474907 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-config-data-custom\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.474956 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-combined-ca-bundle\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.474981 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-config-data\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc 
kubenswrapper[4803]: I0320 17:35:15.475027 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deeb9bec-8bcb-48ef-ba67-b5772825f753-logs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.576995 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-public-tls-certs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.577073 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-config-data-custom\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.577132 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-combined-ca-bundle\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.577159 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-config-data\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.577210 
4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deeb9bec-8bcb-48ef-ba67-b5772825f753-logs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.577231 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-internal-tls-certs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.577252 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcmn7\" (UniqueName: \"kubernetes.io/projected/deeb9bec-8bcb-48ef-ba67-b5772825f753-kube-api-access-hcmn7\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.577758 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/deeb9bec-8bcb-48ef-ba67-b5772825f753-logs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.589230 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-combined-ca-bundle\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.599029 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-config-data-custom\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.600715 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-public-tls-certs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.603006 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-config-data\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.603499 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/deeb9bec-8bcb-48ef-ba67-b5772825f753-internal-tls-certs\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.606688 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcmn7\" (UniqueName: \"kubernetes.io/projected/deeb9bec-8bcb-48ef-ba67-b5772825f753-kube-api-access-hcmn7\") pod \"barbican-api-648ff8c756-h8wjt\" (UID: \"deeb9bec-8bcb-48ef-ba67-b5772825f753\") " pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:15 crc kubenswrapper[4803]: I0320 17:35:15.730231 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.090273 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.204159 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc45n\" (UniqueName: \"kubernetes.io/projected/56c1156a-5e7a-4547-8a7b-46a55651b7a7-kube-api-access-rc45n\") pod \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.204281 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56c1156a-5e7a-4547-8a7b-46a55651b7a7-etc-machine-id\") pod \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.204376 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-db-sync-config-data\") pod \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.204406 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/56c1156a-5e7a-4547-8a7b-46a55651b7a7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "56c1156a-5e7a-4547-8a7b-46a55651b7a7" (UID: "56c1156a-5e7a-4547-8a7b-46a55651b7a7"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.204419 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-config-data\") pod \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.204505 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-combined-ca-bundle\") pod \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.204558 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-scripts\") pod \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\" (UID: \"56c1156a-5e7a-4547-8a7b-46a55651b7a7\") " Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.205431 4803 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/56c1156a-5e7a-4547-8a7b-46a55651b7a7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.209658 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "56c1156a-5e7a-4547-8a7b-46a55651b7a7" (UID: "56c1156a-5e7a-4547-8a7b-46a55651b7a7"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.211539 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c1156a-5e7a-4547-8a7b-46a55651b7a7-kube-api-access-rc45n" (OuterVolumeSpecName: "kube-api-access-rc45n") pod "56c1156a-5e7a-4547-8a7b-46a55651b7a7" (UID: "56c1156a-5e7a-4547-8a7b-46a55651b7a7"). InnerVolumeSpecName "kube-api-access-rc45n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.221730 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-scripts" (OuterVolumeSpecName: "scripts") pod "56c1156a-5e7a-4547-8a7b-46a55651b7a7" (UID: "56c1156a-5e7a-4547-8a7b-46a55651b7a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.229437 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56c1156a-5e7a-4547-8a7b-46a55651b7a7" (UID: "56c1156a-5e7a-4547-8a7b-46a55651b7a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.260284 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-config-data" (OuterVolumeSpecName: "config-data") pod "56c1156a-5e7a-4547-8a7b-46a55651b7a7" (UID: "56c1156a-5e7a-4547-8a7b-46a55651b7a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.306960 4803 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.306999 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.307009 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.307018 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/56c1156a-5e7a-4547-8a7b-46a55651b7a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.307027 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc45n\" (UniqueName: \"kubernetes.io/projected/56c1156a-5e7a-4547-8a7b-46a55651b7a7-kube-api-access-rc45n\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.482540 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dtdjf" event={"ID":"56c1156a-5e7a-4547-8a7b-46a55651b7a7","Type":"ContainerDied","Data":"da37800cc614d82ce9be4b4f78ec36278003861543cd93fa77273b13e36bc53e"} Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.482583 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da37800cc614d82ce9be4b4f78ec36278003861543cd93fa77273b13e36bc53e" Mar 20 17:35:17 crc kubenswrapper[4803]: I0320 17:35:17.482639 4803 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dtdjf" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.337955 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-648ff8c756-h8wjt"] Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.361956 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:18 crc kubenswrapper[4803]: E0320 17:35:18.362303 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c1156a-5e7a-4547-8a7b-46a55651b7a7" containerName="cinder-db-sync" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.362318 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c1156a-5e7a-4547-8a7b-46a55651b7a7" containerName="cinder-db-sync" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.362489 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c1156a-5e7a-4547-8a7b-46a55651b7a7" containerName="cinder-db-sync" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.363323 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.366189 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.366514 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.366661 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-bbgpq" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.367956 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.441425 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/308526b2-6f01-494f-97c8-67e2af66a463-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.441538 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-scripts\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.441694 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.442019 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.442149 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.442186 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlkhf\" (UniqueName: \"kubernetes.io/projected/308526b2-6f01-494f-97c8-67e2af66a463-kube-api-access-rlkhf\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.442457 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.484334 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-4nrzk"] Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.501216 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cz888"] Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.503660 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.511935 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cz888"] Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546443 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546482 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546546 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-config\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546566 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546584 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlkhf\" (UniqueName: 
\"kubernetes.io/projected/308526b2-6f01-494f-97c8-67e2af66a463-kube-api-access-rlkhf\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546618 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-svc\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546637 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/308526b2-6f01-494f-97c8-67e2af66a463-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546669 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-scripts\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546701 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546723 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdm66\" (UniqueName: 
\"kubernetes.io/projected/22a4b5ea-5c7b-4b27-bacf-806ef044a297-kube-api-access-pdm66\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546759 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.546780 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.549810 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/308526b2-6f01-494f-97c8-67e2af66a463-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.556793 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-scripts\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.557326 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.561977 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.565300 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlkhf\" (UniqueName: \"kubernetes.io/projected/308526b2-6f01-494f-97c8-67e2af66a463-kube-api-access-rlkhf\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.565646 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.567227 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.568414 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.568998 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.575707 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.647985 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.648422 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649193 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfdkt\" (UniqueName: \"kubernetes.io/projected/2ca4624d-9e00-4f51-8046-a1a5b01abf31-kube-api-access-nfdkt\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649252 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649279 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-config\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649300 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ca4624d-9e00-4f51-8046-a1a5b01abf31-logs\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649326 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649346 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca4624d-9e00-4f51-8046-a1a5b01abf31-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649365 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-svc\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649389 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data-custom\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.648969 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.649126 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.650203 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-config\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.650850 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-svc\") pod \"dnsmasq-dns-6578955fd5-cz888\" 
(UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.650976 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-scripts\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.651010 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.651038 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdm66\" (UniqueName: \"kubernetes.io/projected/22a4b5ea-5c7b-4b27-bacf-806ef044a297-kube-api-access-pdm66\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.652311 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.670164 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdm66\" (UniqueName: \"kubernetes.io/projected/22a4b5ea-5c7b-4b27-bacf-806ef044a297-kube-api-access-pdm66\") pod \"dnsmasq-dns-6578955fd5-cz888\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " 
pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.713968 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.753079 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.753154 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ca4624d-9e00-4f51-8046-a1a5b01abf31-logs\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.753181 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.753200 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca4624d-9e00-4f51-8046-a1a5b01abf31-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.753228 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data-custom\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 
17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.753257 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-scripts\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.753312 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfdkt\" (UniqueName: \"kubernetes.io/projected/2ca4624d-9e00-4f51-8046-a1a5b01abf31-kube-api-access-nfdkt\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.754056 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca4624d-9e00-4f51-8046-a1a5b01abf31-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.754334 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ca4624d-9e00-4f51-8046-a1a5b01abf31-logs\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.756859 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.758831 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data\") pod \"cinder-api-0\" 
(UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.760098 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data-custom\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.778209 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-scripts\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.784093 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfdkt\" (UniqueName: \"kubernetes.io/projected/2ca4624d-9e00-4f51-8046-a1a5b01abf31-kube-api-access-nfdkt\") pod \"cinder-api-0\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " pod="openstack/cinder-api-0" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.841910 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:18 crc kubenswrapper[4803]: I0320 17:35:18.958437 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.336119 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68dc9dd7c8-7lvg9"] Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.412472 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6bfd9f5f6f-xmdsr"] Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.519175 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" event={"ID":"43090279-19af-4393-ab9a-1092aae61875","Type":"ContainerStarted","Data":"82b42996b68aa2f7d3e531b321b80fccf0b0f3c4092a1512187b5fc910b3ba8e"} Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.522757 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-648ff8c756-h8wjt" event={"ID":"deeb9bec-8bcb-48ef-ba67-b5772825f753","Type":"ContainerStarted","Data":"5a26cab21639a26486f3e8269bb1229e7c4652cef400d6ad18502f118df14682"} Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.523857 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" event={"ID":"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8","Type":"ContainerStarted","Data":"eee53db6489df9fe96abbe84beca496ea288bb0bd9a833e98a8b8731ae42e73f"} Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.702288 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-4nrzk"] Mar 20 17:35:19 crc kubenswrapper[4803]: E0320 17:35:19.715854 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.837568 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-keystone-listener-5b99997f8b-mzkpj"] Mar 20 17:35:19 crc kubenswrapper[4803]: W0320 17:35:19.868069 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod726567e8_cdfd_4fe4_985f_1cf10a787994.slice/crio-8f16670d5d01e106f570803b9d203a2a9200cf6c6fc83d20ea644b7bb4d80be4 WatchSource:0}: Error finding container 8f16670d5d01e106f570803b9d203a2a9200cf6c6fc83d20ea644b7bb4d80be4: Status 404 returned error can't find the container with id 8f16670d5d01e106f570803b9d203a2a9200cf6c6fc83d20ea644b7bb4d80be4 Mar 20 17:35:19 crc kubenswrapper[4803]: I0320 17:35:19.899671 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cz888"] Mar 20 17:35:19 crc kubenswrapper[4803]: W0320 17:35:19.915863 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22a4b5ea_5c7b_4b27_bacf_806ef044a297.slice/crio-6165b5838dd6b0e6aac8645d3c0d43246de91f9f46c8d765aac5c8775147bc49 WatchSource:0}: Error finding container 6165b5838dd6b0e6aac8645d3c0d43246de91f9f46c8d765aac5c8775147bc49: Status 404 returned error can't find the container with id 6165b5838dd6b0e6aac8645d3c0d43246de91f9f46c8d765aac5c8775147bc49 Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.008064 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.036476 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.269679 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.534890 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerStarted","Data":"0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.534987 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="ceilometer-notification-agent" containerID="cri-o://ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb" gracePeriod=30 Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.535068 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="sg-core" containerID="cri-o://48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d" gracePeriod=30 Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.535037 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="proxy-httpd" containerID="cri-o://0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29" gracePeriod=30 Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.535021 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.538615 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" event={"ID":"726567e8-cdfd-4fe4-985f-1cf10a787994","Type":"ContainerStarted","Data":"8f16670d5d01e106f570803b9d203a2a9200cf6c6fc83d20ea644b7bb4d80be4"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.540101 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2ca4624d-9e00-4f51-8046-a1a5b01abf31","Type":"ContainerStarted","Data":"1073d2e022852673493fb882091cc9b1cccb703074a6cb81a55f5acb396ecdba"} Mar 20 
17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.542063 4803 generic.go:334] "Generic (PLEG): container finished" podID="0bf02d9a-4b01-46e2-b167-bd73b4ffa682" containerID="c9431794b22be8f21292a87e876651e362b58100ccc50123ea4f62992a32df5a" exitCode=0 Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.546307 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" event={"ID":"0bf02d9a-4b01-46e2-b167-bd73b4ffa682","Type":"ContainerDied","Data":"c9431794b22be8f21292a87e876651e362b58100ccc50123ea4f62992a32df5a"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.546339 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" event={"ID":"0bf02d9a-4b01-46e2-b167-bd73b4ffa682","Type":"ContainerStarted","Data":"ad9ecff8ca5924f17baee6be3e11799608fdbe52fa09d49bc5db6a60f62e9164"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.560153 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-648ff8c756-h8wjt" event={"ID":"deeb9bec-8bcb-48ef-ba67-b5772825f753","Type":"ContainerStarted","Data":"7b19cb9ab9f6813bb8a42c1b0914da3dfc664c02995fd0ce1d484c464c6a6efd"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.560512 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-648ff8c756-h8wjt" event={"ID":"deeb9bec-8bcb-48ef-ba67-b5772825f753","Type":"ContainerStarted","Data":"31a4220e4e1691122219b46298329b312891a12f2c0f54544e47ad9b5a2f1dfd"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.560699 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.560715 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.563769 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-68dc9dd7c8-7lvg9" event={"ID":"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8","Type":"ContainerStarted","Data":"f38c55f91047cc3d1c7fc29a49dce068912a14eb8b7ac9e771651313896d0771"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.563820 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" event={"ID":"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8","Type":"ContainerStarted","Data":"a48c60edc097556e8ec87b68958c92dbf2a28fcf65bdbf2d802c00855ab77d0a"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.564626 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.565210 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.570484 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"308526b2-6f01-494f-97c8-67e2af66a463","Type":"ContainerStarted","Data":"2901dc0e4c8be48371079e4a51c3c33e1eaaf2e10c67b5ea77968d135b4f98ce"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.599598 4803 generic.go:334] "Generic (PLEG): container finished" podID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerID="dcf1a30c9b0689299043ca01decfe0c25dc5ad1fe50242d439bd849ec0f2446f" exitCode=0 Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.599649 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cz888" event={"ID":"22a4b5ea-5c7b-4b27-bacf-806ef044a297","Type":"ContainerDied","Data":"dcf1a30c9b0689299043ca01decfe0c25dc5ad1fe50242d439bd849ec0f2446f"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.599676 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cz888" 
event={"ID":"22a4b5ea-5c7b-4b27-bacf-806ef044a297","Type":"ContainerStarted","Data":"6165b5838dd6b0e6aac8645d3c0d43246de91f9f46c8d765aac5c8775147bc49"} Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.682150 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-648ff8c756-h8wjt" podStartSLOduration=5.682127293 podStartE2EDuration="5.682127293s" podCreationTimestamp="2026-03-20 17:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:20.607565928 +0000 UTC m=+1130.519158018" watchObservedRunningTime="2026-03-20 17:35:20.682127293 +0000 UTC m=+1130.593719373" Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.693636 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podStartSLOduration=8.693621573 podStartE2EDuration="8.693621573s" podCreationTimestamp="2026-03-20 17:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:20.681242399 +0000 UTC m=+1130.592834499" watchObservedRunningTime="2026-03-20 17:35:20.693621573 +0000 UTC m=+1130.605213643" Mar 20 17:35:20 crc kubenswrapper[4803]: I0320 17:35:20.945371 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.011294 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-sb\") pod \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.011415 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-nb\") pod \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.011443 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-svc\") pod \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.011470 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n2hw\" (UniqueName: \"kubernetes.io/projected/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-kube-api-access-8n2hw\") pod \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.011507 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-swift-storage-0\") pod \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.011567 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-config\") pod \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\" (UID: \"0bf02d9a-4b01-46e2-b167-bd73b4ffa682\") " Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.017398 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-kube-api-access-8n2hw" (OuterVolumeSpecName: "kube-api-access-8n2hw") pod "0bf02d9a-4b01-46e2-b167-bd73b4ffa682" (UID: "0bf02d9a-4b01-46e2-b167-bd73b4ffa682"). InnerVolumeSpecName "kube-api-access-8n2hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.033745 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0bf02d9a-4b01-46e2-b167-bd73b4ffa682" (UID: "0bf02d9a-4b01-46e2-b167-bd73b4ffa682"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.036223 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0bf02d9a-4b01-46e2-b167-bd73b4ffa682" (UID: "0bf02d9a-4b01-46e2-b167-bd73b4ffa682"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.038949 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0bf02d9a-4b01-46e2-b167-bd73b4ffa682" (UID: "0bf02d9a-4b01-46e2-b167-bd73b4ffa682"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.039898 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-config" (OuterVolumeSpecName: "config") pod "0bf02d9a-4b01-46e2-b167-bd73b4ffa682" (UID: "0bf02d9a-4b01-46e2-b167-bd73b4ffa682"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.048626 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0bf02d9a-4b01-46e2-b167-bd73b4ffa682" (UID: "0bf02d9a-4b01-46e2-b167-bd73b4ffa682"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.113272 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.113306 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.113317 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.113328 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n2hw\" (UniqueName: \"kubernetes.io/projected/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-kube-api-access-8n2hw\") on node \"crc\" DevicePath \"\"" Mar 20 
17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.113338 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.113347 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf02d9a-4b01-46e2-b167-bd73b4ffa682-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.613214 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cz888" event={"ID":"22a4b5ea-5c7b-4b27-bacf-806ef044a297","Type":"ContainerStarted","Data":"21db21b0f47fef938d7eda7d1ad82c56a83a469f378ee281fae2661ad2d28a7c"} Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.615160 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.618444 4803 generic.go:334] "Generic (PLEG): container finished" podID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerID="0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29" exitCode=0 Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.618488 4803 generic.go:334] "Generic (PLEG): container finished" podID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerID="48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d" exitCode=2 Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.618602 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerDied","Data":"0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29"} Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.618644 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerDied","Data":"48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d"} Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.622143 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2ca4624d-9e00-4f51-8046-a1a5b01abf31","Type":"ContainerStarted","Data":"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27"} Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.627859 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" event={"ID":"0bf02d9a-4b01-46e2-b167-bd73b4ffa682","Type":"ContainerDied","Data":"ad9ecff8ca5924f17baee6be3e11799608fdbe52fa09d49bc5db6a60f62e9164"} Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.627939 4803 scope.go:117] "RemoveContainer" containerID="c9431794b22be8f21292a87e876651e362b58100ccc50123ea4f62992a32df5a" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.627952 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-4nrzk" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.635558 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-cz888" podStartSLOduration=3.635537197 podStartE2EDuration="3.635537197s" podCreationTimestamp="2026-03-20 17:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:21.632832604 +0000 UTC m=+1131.544424684" watchObservedRunningTime="2026-03-20 17:35:21.635537197 +0000 UTC m=+1131.547129287" Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.730410 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-4nrzk"] Mar 20 17:35:21 crc kubenswrapper[4803]: I0320 17:35:21.738130 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-4nrzk"] Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.640191 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" event={"ID":"43090279-19af-4393-ab9a-1092aae61875","Type":"ContainerStarted","Data":"d5ad1953b5c199e0e2e5ff5ee405dce869f827e33cd5f677d052463deb4d7ed0"} Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.640891 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" event={"ID":"43090279-19af-4393-ab9a-1092aae61875","Type":"ContainerStarted","Data":"0c004d25183c3b5de49655af48769711762fb0cae454f415b0a316d91dd35771"} Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.642584 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" event={"ID":"726567e8-cdfd-4fe4-985f-1cf10a787994","Type":"ContainerStarted","Data":"22eb53369c4622c1052113de784a27243ee02e9e850282267e82a2f4a4c0e793"} Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.642630 
4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" event={"ID":"726567e8-cdfd-4fe4-985f-1cf10a787994","Type":"ContainerStarted","Data":"53a659dd80861142eb8950674792d1afc9eb64753f9e43751009e45f1882224a"} Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.644986 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"308526b2-6f01-494f-97c8-67e2af66a463","Type":"ContainerStarted","Data":"a4a638f11febcc5451d3de9db9cad3a64d2db28153134d5471967dfec9a3225c"} Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.647137 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2ca4624d-9e00-4f51-8046-a1a5b01abf31","Type":"ContainerStarted","Data":"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4"} Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.647331 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.647358 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api-log" containerID="cri-o://ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27" gracePeriod=30 Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.647370 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api" containerID="cri-o://3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4" gracePeriod=30 Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.662886 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6bfd9f5f6f-xmdsr" podStartSLOduration=8.29840166 podStartE2EDuration="10.66287104s" 
podCreationTimestamp="2026-03-20 17:35:12 +0000 UTC" firstStartedPulling="2026-03-20 17:35:19.431854234 +0000 UTC m=+1129.343446304" lastFinishedPulling="2026-03-20 17:35:21.796323614 +0000 UTC m=+1131.707915684" observedRunningTime="2026-03-20 17:35:22.657717011 +0000 UTC m=+1132.569309081" watchObservedRunningTime="2026-03-20 17:35:22.66287104 +0000 UTC m=+1132.574463100" Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.697962 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b99997f8b-mzkpj" podStartSLOduration=8.774669315 podStartE2EDuration="10.697945848s" podCreationTimestamp="2026-03-20 17:35:12 +0000 UTC" firstStartedPulling="2026-03-20 17:35:19.873188665 +0000 UTC m=+1129.784780735" lastFinishedPulling="2026-03-20 17:35:21.796465198 +0000 UTC m=+1131.708057268" observedRunningTime="2026-03-20 17:35:22.683549789 +0000 UTC m=+1132.595141879" watchObservedRunningTime="2026-03-20 17:35:22.697945848 +0000 UTC m=+1132.609537908" Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.729814 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.7297953889999995 podStartE2EDuration="4.729795389s" podCreationTimestamp="2026-03-20 17:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:22.705902153 +0000 UTC m=+1132.617494223" watchObservedRunningTime="2026-03-20 17:35:22.729795389 +0000 UTC m=+1132.641387459" Mar 20 17:35:22 crc kubenswrapper[4803]: I0320 17:35:22.870737 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf02d9a-4b01-46e2-b167-bd73b4ffa682" path="/var/lib/kubelet/pods/0bf02d9a-4b01-46e2-b167-bd73b4ffa682/volumes" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.358103 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.494685 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-combined-ca-bundle\") pod \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.494747 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca4624d-9e00-4f51-8046-a1a5b01abf31-etc-machine-id\") pod \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.494791 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfdkt\" (UniqueName: \"kubernetes.io/projected/2ca4624d-9e00-4f51-8046-a1a5b01abf31-kube-api-access-nfdkt\") pod \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.494836 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-scripts\") pod \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.494896 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ca4624d-9e00-4f51-8046-a1a5b01abf31-logs\") pod \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.494930 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data-custom\") pod \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.495132 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data\") pod \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\" (UID: \"2ca4624d-9e00-4f51-8046-a1a5b01abf31\") " Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.494825 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ca4624d-9e00-4f51-8046-a1a5b01abf31-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2ca4624d-9e00-4f51-8046-a1a5b01abf31" (UID: "2ca4624d-9e00-4f51-8046-a1a5b01abf31"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.495545 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ca4624d-9e00-4f51-8046-a1a5b01abf31-logs" (OuterVolumeSpecName: "logs") pod "2ca4624d-9e00-4f51-8046-a1a5b01abf31" (UID: "2ca4624d-9e00-4f51-8046-a1a5b01abf31"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.495603 4803 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2ca4624d-9e00-4f51-8046-a1a5b01abf31-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.501691 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-scripts" (OuterVolumeSpecName: "scripts") pod "2ca4624d-9e00-4f51-8046-a1a5b01abf31" (UID: "2ca4624d-9e00-4f51-8046-a1a5b01abf31"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.501721 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca4624d-9e00-4f51-8046-a1a5b01abf31-kube-api-access-nfdkt" (OuterVolumeSpecName: "kube-api-access-nfdkt") pod "2ca4624d-9e00-4f51-8046-a1a5b01abf31" (UID: "2ca4624d-9e00-4f51-8046-a1a5b01abf31"). InnerVolumeSpecName "kube-api-access-nfdkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.509684 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2ca4624d-9e00-4f51-8046-a1a5b01abf31" (UID: "2ca4624d-9e00-4f51-8046-a1a5b01abf31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.532685 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ca4624d-9e00-4f51-8046-a1a5b01abf31" (UID: "2ca4624d-9e00-4f51-8046-a1a5b01abf31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.567682 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data" (OuterVolumeSpecName: "config-data") pod "2ca4624d-9e00-4f51-8046-a1a5b01abf31" (UID: "2ca4624d-9e00-4f51-8046-a1a5b01abf31"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.597343 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.597369 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.597381 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfdkt\" (UniqueName: \"kubernetes.io/projected/2ca4624d-9e00-4f51-8046-a1a5b01abf31-kube-api-access-nfdkt\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.597389 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.597398 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ca4624d-9e00-4f51-8046-a1a5b01abf31-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.597406 4803 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2ca4624d-9e00-4f51-8046-a1a5b01abf31-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.663109 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"308526b2-6f01-494f-97c8-67e2af66a463","Type":"ContainerStarted","Data":"d920dde73a7030816b48063af3829541840a9c22afb026187f4d65fcb5b6eb97"} Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 
17:35:23.666181 4803 generic.go:334] "Generic (PLEG): container finished" podID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerID="3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4" exitCode=0 Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.666400 4803 generic.go:334] "Generic (PLEG): container finished" podID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerID="ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27" exitCode=143 Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.666485 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2ca4624d-9e00-4f51-8046-a1a5b01abf31","Type":"ContainerDied","Data":"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4"} Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.666515 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.666560 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2ca4624d-9e00-4f51-8046-a1a5b01abf31","Type":"ContainerDied","Data":"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27"} Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.666577 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2ca4624d-9e00-4f51-8046-a1a5b01abf31","Type":"ContainerDied","Data":"1073d2e022852673493fb882091cc9b1cccb703074a6cb81a55f5acb396ecdba"} Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.666595 4803 scope.go:117] "RemoveContainer" containerID="3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.684895 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.770779778 podStartE2EDuration="5.684877809s" podCreationTimestamp="2026-03-20 17:35:18 +0000 UTC" 
firstStartedPulling="2026-03-20 17:35:19.98694917 +0000 UTC m=+1129.898541230" lastFinishedPulling="2026-03-20 17:35:20.901047191 +0000 UTC m=+1130.812639261" observedRunningTime="2026-03-20 17:35:23.681760205 +0000 UTC m=+1133.593352275" watchObservedRunningTime="2026-03-20 17:35:23.684877809 +0000 UTC m=+1133.596469899" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.704699 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.714467 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.716767 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.731032 4803 scope.go:117] "RemoveContainer" containerID="ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.733162 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:23 crc kubenswrapper[4803]: E0320 17:35:23.733629 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.733648 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api" Mar 20 17:35:23 crc kubenswrapper[4803]: E0320 17:35:23.733687 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api-log" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.733694 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api-log" Mar 20 17:35:23 crc kubenswrapper[4803]: E0320 17:35:23.733712 4803 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="0bf02d9a-4b01-46e2-b167-bd73b4ffa682" containerName="init" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.733717 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf02d9a-4b01-46e2-b167-bd73b4ffa682" containerName="init" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.733921 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api-log" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.733946 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" containerName="cinder-api" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.733956 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf02d9a-4b01-46e2-b167-bd73b4ffa682" containerName="init" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.734979 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.745337 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.748046 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.748242 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.748790 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.753875 4803 scope.go:117] "RemoveContainer" containerID="3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4" Mar 20 17:35:23 crc kubenswrapper[4803]: E0320 17:35:23.755142 4803 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4\": container with ID starting with 3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4 not found: ID does not exist" containerID="3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.755167 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4"} err="failed to get container status \"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4\": rpc error: code = NotFound desc = could not find container \"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4\": container with ID starting with 3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4 not found: ID does not exist" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.755184 4803 scope.go:117] "RemoveContainer" containerID="ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27" Mar 20 17:35:23 crc kubenswrapper[4803]: E0320 17:35:23.756464 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27\": container with ID starting with ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27 not found: ID does not exist" containerID="ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.756489 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27"} err="failed to get container status \"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27\": rpc error: code = NotFound desc = could not find container 
\"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27\": container with ID starting with ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27 not found: ID does not exist" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.756502 4803 scope.go:117] "RemoveContainer" containerID="3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.756782 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4"} err="failed to get container status \"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4\": rpc error: code = NotFound desc = could not find container \"3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4\": container with ID starting with 3864f6f2e344686296eef4e83ebd6358c6758aefbe02a9c8b47d2643a4a7f9b4 not found: ID does not exist" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.756811 4803 scope.go:117] "RemoveContainer" containerID="ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.757128 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27"} err="failed to get container status \"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27\": rpc error: code = NotFound desc = could not find container \"ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27\": container with ID starting with ced7d12ecb46c947165884d677872374547b79c6bf0999101568c211bfc42d27 not found: ID does not exist" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.799911 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035c521a-8bdf-4489-a429-3629df54ca84-logs\") 
pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.799950 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.799983 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-config-data-custom\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.800015 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/035c521a-8bdf-4489-a429-3629df54ca84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.800064 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-scripts\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.800122 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-public-tls-certs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc 
kubenswrapper[4803]: I0320 17:35:23.800157 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-config-data\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.800192 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hcnk\" (UniqueName: \"kubernetes.io/projected/035c521a-8bdf-4489-a429-3629df54ca84-kube-api-access-2hcnk\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.800211 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.865379 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.879053 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.901921 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-public-tls-certs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.901992 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-config-data\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.902048 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hcnk\" (UniqueName: \"kubernetes.io/projected/035c521a-8bdf-4489-a429-3629df54ca84-kube-api-access-2hcnk\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.902076 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.902161 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035c521a-8bdf-4489-a429-3629df54ca84-logs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.902213 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.902252 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-config-data-custom\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 
crc kubenswrapper[4803]: I0320 17:35:23.902286 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/035c521a-8bdf-4489-a429-3629df54ca84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.902336 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-scripts\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.904205 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035c521a-8bdf-4489-a429-3629df54ca84-logs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.904269 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/035c521a-8bdf-4489-a429-3629df54ca84-etc-machine-id\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.909331 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-config-data-custom\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.910007 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-config-data\") pod \"cinder-api-0\" (UID: 
\"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.910795 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-scripts\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.914886 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.921033 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-public-tls-certs\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.922029 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035c521a-8bdf-4489-a429-3629df54ca84-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.924199 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hcnk\" (UniqueName: \"kubernetes.io/projected/035c521a-8bdf-4489-a429-3629df54ca84-kube-api-access-2hcnk\") pod \"cinder-api-0\" (UID: \"035c521a-8bdf-4489-a429-3629df54ca84\") " pod="openstack/cinder-api-0" Mar 20 17:35:23 crc kubenswrapper[4803]: I0320 17:35:23.983873 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.051086 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.105889 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-config-data\") pod \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.106036 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-sg-core-conf-yaml\") pod \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.106167 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-scripts\") pod \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.106199 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-combined-ca-bundle\") pod \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.106265 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-log-httpd\") pod \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " Mar 20 17:35:24 crc 
kubenswrapper[4803]: I0320 17:35:24.106336 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9whv\" (UniqueName: \"kubernetes.io/projected/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-kube-api-access-j9whv\") pod \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.106372 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-run-httpd\") pod \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\" (UID: \"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40\") " Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.107245 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" (UID: "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.108085 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" (UID: "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.110453 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-kube-api-access-j9whv" (OuterVolumeSpecName: "kube-api-access-j9whv") pod "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" (UID: "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40"). InnerVolumeSpecName "kube-api-access-j9whv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.113624 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-scripts" (OuterVolumeSpecName: "scripts") pod "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" (UID: "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.164331 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" (UID: "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.202816 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" (UID: "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.210935 4803 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.210966 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.210981 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.210989 4803 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.210998 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9whv\" (UniqueName: \"kubernetes.io/projected/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-kube-api-access-j9whv\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.211011 4803 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.242743 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-config-data" (OuterVolumeSpecName: "config-data") pod "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" (UID: "7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.312553 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.491584 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 17:35:24 crc kubenswrapper[4803]: W0320 17:35:24.498436 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod035c521a_8bdf_4489_a429_3629df54ca84.slice/crio-1d629b8caa66a3ef86bf4388e2864dbd84c0d323fee8308a801439dd07605a05 WatchSource:0}: Error finding container 1d629b8caa66a3ef86bf4388e2864dbd84c0d323fee8308a801439dd07605a05: Status 404 returned error can't find the container with id 1d629b8caa66a3ef86bf4388e2864dbd84c0d323fee8308a801439dd07605a05 Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.558437 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.757755 4803 generic.go:334] "Generic (PLEG): container finished" podID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerID="ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb" exitCode=0 Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.757844 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerDied","Data":"ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb"} Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.757870 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40","Type":"ContainerDied","Data":"d0cf40118b694a5b1c4dae13e3ac02b7e1fb3f036a87b457101598dc5c0f053f"} Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.757887 4803 scope.go:117] "RemoveContainer" containerID="0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.758031 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.804063 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"035c521a-8bdf-4489-a429-3629df54ca84","Type":"ContainerStarted","Data":"1d629b8caa66a3ef86bf4388e2864dbd84c0d323fee8308a801439dd07605a05"} Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.898801 4803 scope.go:117] "RemoveContainer" containerID="48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.921768 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca4624d-9e00-4f51-8046-a1a5b01abf31" path="/var/lib/kubelet/pods/2ca4624d-9e00-4f51-8046-a1a5b01abf31/volumes" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.922598 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.925595 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.948747 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c76885c69-tg6gl"] Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.949005 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c76885c69-tg6gl" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-api" 
containerID="cri-o://0ded043744782977e6c1dc9d31e30bbcb0eeaec113bae6bf8f1a46cd831b848a" gracePeriod=30 Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.949647 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c76885c69-tg6gl" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-httpd" containerID="cri-o://96a6e4bc9c87ace754b9960b446e01b6d5898a11961cc4f8d2b48a5403ef2751" gracePeriod=30 Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.971613 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:24 crc kubenswrapper[4803]: E0320 17:35:24.972030 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="proxy-httpd" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.972047 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="proxy-httpd" Mar 20 17:35:24 crc kubenswrapper[4803]: E0320 17:35:24.972058 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="sg-core" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.972066 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="sg-core" Mar 20 17:35:24 crc kubenswrapper[4803]: E0320 17:35:24.972093 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="ceilometer-notification-agent" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.972100 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="ceilometer-notification-agent" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.972271 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" 
containerName="ceilometer-notification-agent" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.972295 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="sg-core" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.972305 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" containerName="proxy-httpd" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.973877 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.979401 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.980375 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.986730 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.990755 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7c76885c69-tg6gl" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": EOF" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.992659 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6b69588b57-phch4"] Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.996465 4803 scope.go:117] "RemoveContainer" containerID="ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb" Mar 20 17:35:24 crc kubenswrapper[4803]: I0320 17:35:24.998288 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.002310 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b69588b57-phch4"] Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.033545 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-log-httpd\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.033629 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-scripts\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.033705 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cngv5\" (UniqueName: \"kubernetes.io/projected/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-kube-api-access-cngv5\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.033797 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-config-data\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.033872 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.034061 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-run-httpd\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.034280 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.060744 4803 scope.go:117] "RemoveContainer" containerID="0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29" Mar 20 17:35:25 crc kubenswrapper[4803]: E0320 17:35:25.064907 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29\": container with ID starting with 0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29 not found: ID does not exist" containerID="0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.064942 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29"} err="failed to get container status \"0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29\": rpc error: code = NotFound desc = could not find container \"0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29\": container with ID starting with 
0d33e2f3a9c39d0629c7ed679dc6f98bcb63864760a37e9286e0c5475a913d29 not found: ID does not exist" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.064967 4803 scope.go:117] "RemoveContainer" containerID="48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d" Mar 20 17:35:25 crc kubenswrapper[4803]: E0320 17:35:25.065292 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d\": container with ID starting with 48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d not found: ID does not exist" containerID="48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.065315 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d"} err="failed to get container status \"48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d\": rpc error: code = NotFound desc = could not find container \"48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d\": container with ID starting with 48a294395c22d875763c59b08a1eed2775e773594c0b0730f4d4bf585cd13e9d not found: ID does not exist" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.065328 4803 scope.go:117] "RemoveContainer" containerID="ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb" Mar 20 17:35:25 crc kubenswrapper[4803]: E0320 17:35:25.065563 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb\": container with ID starting with ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb not found: ID does not exist" containerID="ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb" Mar 20 17:35:25 crc 
kubenswrapper[4803]: I0320 17:35:25.065605 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb"} err="failed to get container status \"ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb\": rpc error: code = NotFound desc = could not find container \"ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb\": container with ID starting with ad79a67e003f7c206380d16dbf1bdd2ac6eadf958fbe19877463e6b6ff1330bb not found: ID does not exist" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136303 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136362 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-combined-ca-bundle\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136387 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-log-httpd\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136461 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-internal-tls-certs\") pod 
\"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136486 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-scripts\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136505 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-httpd-config\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136580 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cngv5\" (UniqueName: \"kubernetes.io/projected/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-kube-api-access-cngv5\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136609 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-config-data\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136638 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136656 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-public-tls-certs\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136676 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-run-httpd\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136696 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-config\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136732 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-ovndb-tls-certs\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.136762 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnsf\" (UniqueName: \"kubernetes.io/projected/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-kube-api-access-mwnsf\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.137989 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-log-httpd\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.138110 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-run-httpd\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.143226 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.143551 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-config-data\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.145949 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.149325 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-scripts\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.153111 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cngv5\" (UniqueName: \"kubernetes.io/projected/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-kube-api-access-cngv5\") pod \"ceilometer-0\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.238630 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-httpd-config\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.238739 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-public-tls-certs\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.238782 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-config\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.238894 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-ovndb-tls-certs\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.238922 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnsf\" (UniqueName: 
\"kubernetes.io/projected/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-kube-api-access-mwnsf\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.238968 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-combined-ca-bundle\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.239036 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-internal-tls-certs\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.244582 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-httpd-config\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.245781 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-public-tls-certs\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.245932 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-ovndb-tls-certs\") pod 
\"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.246144 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-internal-tls-certs\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.250044 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-config\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.256264 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnsf\" (UniqueName: \"kubernetes.io/projected/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-kube-api-access-mwnsf\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.257281 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6-combined-ca-bundle\") pod \"neutron-6b69588b57-phch4\" (UID: \"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6\") " pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.349049 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.362138 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.846039 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"035c521a-8bdf-4489-a429-3629df54ca84","Type":"ContainerStarted","Data":"aa60122302ac4a7f8257571d0c104df579dc6a2e839c09a6db587d6128a9049f"} Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.870336 4803 generic.go:334] "Generic (PLEG): container finished" podID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerID="96a6e4bc9c87ace754b9960b446e01b6d5898a11961cc4f8d2b48a5403ef2751" exitCode=0 Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.871240 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c76885c69-tg6gl" event={"ID":"6570dd6e-761d-4d3d-99e8-5dfaba169520","Type":"ContainerDied","Data":"96a6e4bc9c87ace754b9960b446e01b6d5898a11961cc4f8d2b48a5403ef2751"} Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.878765 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-596cfc5b56-w5pbk" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.955897 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:35:25 crc kubenswrapper[4803]: I0320 17:35:25.968010 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.001344 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75b65b9966-sh4pn"] Mar 20 17:35:26 crc kubenswrapper[4803]: W0320 17:35:26.149177 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a750f8a_dfa6_41fa_91c0_8d5c4eb8feb6.slice/crio-f5c730b1d24a03d9675aa42622ac82c77d863ec57436297961709537303270fe WatchSource:0}: Error finding container 
f5c730b1d24a03d9675aa42622ac82c77d863ec57436297961709537303270fe: Status 404 returned error can't find the container with id f5c730b1d24a03d9675aa42622ac82c77d863ec57436297961709537303270fe Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.150657 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6b69588b57-phch4"] Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.359375 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7c76885c69-tg6gl" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": dial tcp 10.217.0.160:9696: connect: connection refused" Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.858248 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40" path="/var/lib/kubelet/pods/7b607dc7-b1a9-4b55-acfb-4e3a4b55fe40/volumes" Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.878835 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"035c521a-8bdf-4489-a429-3629df54ca84","Type":"ContainerStarted","Data":"f090575ada9dba83503a4fa96b8bf2b2c5783ba17450abe591a64cae3df1a774"} Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.879152 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.880808 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b69588b57-phch4" event={"ID":"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6","Type":"ContainerStarted","Data":"066f43a68207352ea615d8404a82eb7769c413ae4ae89c5e25ca35464aff4ccb"} Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.880833 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b69588b57-phch4" 
event={"ID":"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6","Type":"ContainerStarted","Data":"9e1fe4e5c258fbb3179fd74d0df525e481c296f26e6cbf5bd8ac37107687219e"} Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.880844 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6b69588b57-phch4" event={"ID":"9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6","Type":"ContainerStarted","Data":"f5c730b1d24a03d9675aa42622ac82c77d863ec57436297961709537303270fe"} Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.880947 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.883740 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75b65b9966-sh4pn" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon-log" containerID="cri-o://118e9705aaa8a134f7b51ee635781e210861eb775bae53298f2aeeb57f80314c" gracePeriod=30 Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.884068 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerStarted","Data":"bff37b47bc7555f549b0c6203954ec0d02e36d7a6dba536236b062330184be15"} Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.884097 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerStarted","Data":"6604352edbf5672477a53f226bfa7a69c5155e198737ed69726600ee44b8d7fe"} Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.884135 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-75b65b9966-sh4pn" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon" containerID="cri-o://5f8eee39cb1e2bae324df0b81bafc95c7eb474ef6d9cd9db257fe0d718615d13" gracePeriod=30 Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 
17:35:26.914579 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.914562049 podStartE2EDuration="3.914562049s" podCreationTimestamp="2026-03-20 17:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:26.900117478 +0000 UTC m=+1136.811709548" watchObservedRunningTime="2026-03-20 17:35:26.914562049 +0000 UTC m=+1136.826154119" Mar 20 17:35:26 crc kubenswrapper[4803]: I0320 17:35:26.923830 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6b69588b57-phch4" podStartSLOduration=2.923814159 podStartE2EDuration="2.923814159s" podCreationTimestamp="2026-03-20 17:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:26.923004027 +0000 UTC m=+1136.834596107" watchObservedRunningTime="2026-03-20 17:35:26.923814159 +0000 UTC m=+1136.835406239" Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.305743 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.368446 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-648ff8c756-h8wjt" Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.430210 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-68dc9dd7c8-7lvg9"] Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.430739 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api-log" containerID="cri-o://a48c60edc097556e8ec87b68958c92dbf2a28fcf65bdbf2d802c00855ab77d0a" gracePeriod=30 Mar 20 17:35:27 crc 
kubenswrapper[4803]: I0320 17:35:27.431635 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api" containerID="cri-o://f38c55f91047cc3d1c7fc29a49dce068912a14eb8b7ac9e771651313896d0771" gracePeriod=30 Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.444716 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF" Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.444919 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": EOF" Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.895921 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerStarted","Data":"4b7680dd7ce3f95d2239bf669dacd6a291aef4b734cafb33bbfd0cb3b924ad73"} Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.901367 4803 generic.go:334] "Generic (PLEG): container finished" podID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerID="0ded043744782977e6c1dc9d31e30bbcb0eeaec113bae6bf8f1a46cd831b848a" exitCode=0 Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.901454 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c76885c69-tg6gl" event={"ID":"6570dd6e-761d-4d3d-99e8-5dfaba169520","Type":"ContainerDied","Data":"0ded043744782977e6c1dc9d31e30bbcb0eeaec113bae6bf8f1a46cd831b848a"} Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.903960 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerID="a48c60edc097556e8ec87b68958c92dbf2a28fcf65bdbf2d802c00855ab77d0a" exitCode=143 Mar 20 17:35:27 crc kubenswrapper[4803]: I0320 17:35:27.904783 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" event={"ID":"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8","Type":"ContainerDied","Data":"a48c60edc097556e8ec87b68958c92dbf2a28fcf65bdbf2d802c00855ab77d0a"} Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.439591 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.510500 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-public-tls-certs\") pod \"6570dd6e-761d-4d3d-99e8-5dfaba169520\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.510845 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-internal-tls-certs\") pod \"6570dd6e-761d-4d3d-99e8-5dfaba169520\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.510872 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlqx9\" (UniqueName: \"kubernetes.io/projected/6570dd6e-761d-4d3d-99e8-5dfaba169520-kube-api-access-jlqx9\") pod \"6570dd6e-761d-4d3d-99e8-5dfaba169520\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.510909 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-httpd-config\") pod 
\"6570dd6e-761d-4d3d-99e8-5dfaba169520\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.510949 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-ovndb-tls-certs\") pod \"6570dd6e-761d-4d3d-99e8-5dfaba169520\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.511015 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-config\") pod \"6570dd6e-761d-4d3d-99e8-5dfaba169520\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.511045 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-combined-ca-bundle\") pod \"6570dd6e-761d-4d3d-99e8-5dfaba169520\" (UID: \"6570dd6e-761d-4d3d-99e8-5dfaba169520\") " Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.521834 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6570dd6e-761d-4d3d-99e8-5dfaba169520" (UID: "6570dd6e-761d-4d3d-99e8-5dfaba169520"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.533232 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6570dd6e-761d-4d3d-99e8-5dfaba169520-kube-api-access-jlqx9" (OuterVolumeSpecName: "kube-api-access-jlqx9") pod "6570dd6e-761d-4d3d-99e8-5dfaba169520" (UID: "6570dd6e-761d-4d3d-99e8-5dfaba169520"). InnerVolumeSpecName "kube-api-access-jlqx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.557733 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-config" (OuterVolumeSpecName: "config") pod "6570dd6e-761d-4d3d-99e8-5dfaba169520" (UID: "6570dd6e-761d-4d3d-99e8-5dfaba169520"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.567731 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6570dd6e-761d-4d3d-99e8-5dfaba169520" (UID: "6570dd6e-761d-4d3d-99e8-5dfaba169520"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.571702 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6570dd6e-761d-4d3d-99e8-5dfaba169520" (UID: "6570dd6e-761d-4d3d-99e8-5dfaba169520"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.598610 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6570dd6e-761d-4d3d-99e8-5dfaba169520" (UID: "6570dd6e-761d-4d3d-99e8-5dfaba169520"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.612606 4803 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.612630 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.612640 4803 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.612651 4803 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.612661 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlqx9\" (UniqueName: \"kubernetes.io/projected/6570dd6e-761d-4d3d-99e8-5dfaba169520-kube-api-access-jlqx9\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.612672 4803 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.612782 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6570dd6e-761d-4d3d-99e8-5dfaba169520" (UID: 
"6570dd6e-761d-4d3d-99e8-5dfaba169520"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.714015 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6570dd6e-761d-4d3d-99e8-5dfaba169520-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.843732 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.949275 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerStarted","Data":"936bb04a57759b534c0f972fdb52034feee0bcf9146e96363e79ff604490434a"} Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.951368 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c76885c69-tg6gl" event={"ID":"6570dd6e-761d-4d3d-99e8-5dfaba169520","Type":"ContainerDied","Data":"de3c35d0ca5dc5f020a51eb767b69c50a6905108c2766cc71b3e81988994154f"} Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.951396 4803 scope.go:117] "RemoveContainer" containerID="96a6e4bc9c87ace754b9960b446e01b6d5898a11961cc4f8d2b48a5403ef2751" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.951515 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c76885c69-tg6gl" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.957151 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-mh5hz"] Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.957361 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerName="dnsmasq-dns" containerID="cri-o://1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6" gracePeriod=10 Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.988292 4803 scope.go:117] "RemoveContainer" containerID="0ded043744782977e6c1dc9d31e30bbcb0eeaec113bae6bf8f1a46cd831b848a" Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.988692 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c76885c69-tg6gl"] Mar 20 17:35:28 crc kubenswrapper[4803]: I0320 17:35:28.999472 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c76885c69-tg6gl"] Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.125898 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.171222 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.554839 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.647340 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-config\") pod \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.648036 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-nb\") pod \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.648220 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-swift-storage-0\") pod \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.648276 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7b88\" (UniqueName: \"kubernetes.io/projected/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-kube-api-access-r7b88\") pod \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.648358 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-svc\") pod \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.648442 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-sb\") pod \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\" (UID: \"ed2a5efb-ed66-4e83-96e9-12ae1bf75905\") " Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.668754 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-kube-api-access-r7b88" (OuterVolumeSpecName: "kube-api-access-r7b88") pod "ed2a5efb-ed66-4e83-96e9-12ae1bf75905" (UID: "ed2a5efb-ed66-4e83-96e9-12ae1bf75905"). InnerVolumeSpecName "kube-api-access-r7b88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.701555 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ed2a5efb-ed66-4e83-96e9-12ae1bf75905" (UID: "ed2a5efb-ed66-4e83-96e9-12ae1bf75905"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.704331 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ed2a5efb-ed66-4e83-96e9-12ae1bf75905" (UID: "ed2a5efb-ed66-4e83-96e9-12ae1bf75905"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.705839 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ed2a5efb-ed66-4e83-96e9-12ae1bf75905" (UID: "ed2a5efb-ed66-4e83-96e9-12ae1bf75905"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.708499 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ed2a5efb-ed66-4e83-96e9-12ae1bf75905" (UID: "ed2a5efb-ed66-4e83-96e9-12ae1bf75905"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.736029 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-config" (OuterVolumeSpecName: "config") pod "ed2a5efb-ed66-4e83-96e9-12ae1bf75905" (UID: "ed2a5efb-ed66-4e83-96e9-12ae1bf75905"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.751092 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.751123 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.751132 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.751141 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:29 
crc kubenswrapper[4803]: I0320 17:35:29.751150 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7b88\" (UniqueName: \"kubernetes.io/projected/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-kube-api-access-r7b88\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.751160 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ed2a5efb-ed66-4e83-96e9-12ae1bf75905-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.959737 4803 generic.go:334] "Generic (PLEG): container finished" podID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerID="1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6" exitCode=0 Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.959788 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" event={"ID":"ed2a5efb-ed66-4e83-96e9-12ae1bf75905","Type":"ContainerDied","Data":"1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6"} Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.959811 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" event={"ID":"ed2a5efb-ed66-4e83-96e9-12ae1bf75905","Type":"ContainerDied","Data":"7b5232fa19e4da138a626ca0344be4d0ddc2f28144ddb103c25350a02af16e63"} Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.959828 4803 scope.go:117] "RemoveContainer" containerID="1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.959899 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.971570 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="probe" containerID="cri-o://d920dde73a7030816b48063af3829541840a9c22afb026187f4d65fcb5b6eb97" gracePeriod=30 Mar 20 17:35:29 crc kubenswrapper[4803]: I0320 17:35:29.971572 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="cinder-scheduler" containerID="cri-o://a4a638f11febcc5451d3de9db9cad3a64d2db28153134d5471967dfec9a3225c" gracePeriod=30 Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.003771 4803 scope.go:117] "RemoveContainer" containerID="c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01" Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.003874 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-mh5hz"] Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.013854 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-mh5hz"] Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.036650 4803 scope.go:117] "RemoveContainer" containerID="1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6" Mar 20 17:35:30 crc kubenswrapper[4803]: E0320 17:35:30.037100 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6\": container with ID starting with 1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6 not found: ID does not exist" containerID="1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6" Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.037146 4803 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6"} err="failed to get container status \"1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6\": rpc error: code = NotFound desc = could not find container \"1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6\": container with ID starting with 1a5372e2bbba60e7068bcf87987be98a818da4ba34398bf9e3abbc93a138a4c6 not found: ID does not exist" Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.037172 4803 scope.go:117] "RemoveContainer" containerID="c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01" Mar 20 17:35:30 crc kubenswrapper[4803]: E0320 17:35:30.037470 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01\": container with ID starting with c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01 not found: ID does not exist" containerID="c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01" Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.037499 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01"} err="failed to get container status \"c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01\": rpc error: code = NotFound desc = could not find container \"c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01\": container with ID starting with c0e9c84ab809e2ed3469e76856a7db4857c97818b208f35d9f65e26229c34a01 not found: ID does not exist" Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.883241 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" 
path="/var/lib/kubelet/pods/6570dd6e-761d-4d3d-99e8-5dfaba169520/volumes" Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.885288 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" path="/var/lib/kubelet/pods/ed2a5efb-ed66-4e83-96e9-12ae1bf75905/volumes" Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.982167 4803 generic.go:334] "Generic (PLEG): container finished" podID="308526b2-6f01-494f-97c8-67e2af66a463" containerID="d920dde73a7030816b48063af3829541840a9c22afb026187f4d65fcb5b6eb97" exitCode=0 Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.982233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"308526b2-6f01-494f-97c8-67e2af66a463","Type":"ContainerDied","Data":"d920dde73a7030816b48063af3829541840a9c22afb026187f4d65fcb5b6eb97"} Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.984787 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerStarted","Data":"af3018a478f58bc49449e6e7cba6f378481703f4bc819c22afd93d43e3228f30"} Mar 20 17:35:30 crc kubenswrapper[4803]: I0320 17:35:30.985990 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:35:31 crc kubenswrapper[4803]: I0320 17:35:31.002793 4803 generic.go:334] "Generic (PLEG): container finished" podID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerID="5f8eee39cb1e2bae324df0b81bafc95c7eb474ef6d9cd9db257fe0d718615d13" exitCode=0 Mar 20 17:35:31 crc kubenswrapper[4803]: I0320 17:35:31.003077 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b65b9966-sh4pn" event={"ID":"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed","Type":"ContainerDied","Data":"5f8eee39cb1e2bae324df0b81bafc95c7eb474ef6d9cd9db257fe0d718615d13"} Mar 20 17:35:31 crc kubenswrapper[4803]: I0320 17:35:31.019021 4803 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.326604204 podStartE2EDuration="7.019003227s" podCreationTimestamp="2026-03-20 17:35:24 +0000 UTC" firstStartedPulling="2026-03-20 17:35:25.970935179 +0000 UTC m=+1135.882527249" lastFinishedPulling="2026-03-20 17:35:30.663334172 +0000 UTC m=+1140.574926272" observedRunningTime="2026-03-20 17:35:31.01762424 +0000 UTC m=+1140.929216380" watchObservedRunningTime="2026-03-20 17:35:31.019003227 +0000 UTC m=+1140.930595297" Mar 20 17:35:31 crc kubenswrapper[4803]: I0320 17:35:31.662646 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75b65b9966-sh4pn" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 20 17:35:32 crc kubenswrapper[4803]: I0320 17:35:32.851383 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:52444->10.217.0.167:9311: read: connection reset by peer" Mar 20 17:35:32 crc kubenswrapper[4803]: I0320 17:35:32.851466 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:52446->10.217.0.167:9311: read: connection reset by peer" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.023302 4803 generic.go:334] "Generic (PLEG): container finished" podID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerID="f38c55f91047cc3d1c7fc29a49dce068912a14eb8b7ac9e771651313896d0771" exitCode=0 Mar 20 17:35:33 
crc kubenswrapper[4803]: I0320 17:35:33.023374 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" event={"ID":"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8","Type":"ContainerDied","Data":"f38c55f91047cc3d1c7fc29a49dce068912a14eb8b7ac9e771651313896d0771"} Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.355054 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.419497 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-combined-ca-bundle\") pod \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.419640 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-logs\") pod \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.419693 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data-custom\") pod \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.419715 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8njt\" (UniqueName: \"kubernetes.io/projected/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-kube-api-access-x8njt\") pod \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.419806 4803 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data\") pod \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\" (UID: \"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8\") " Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.420081 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-logs" (OuterVolumeSpecName: "logs") pod "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" (UID: "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.420411 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.425640 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" (UID: "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.425795 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-kube-api-access-x8njt" (OuterVolumeSpecName: "kube-api-access-x8njt") pod "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" (UID: "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8"). InnerVolumeSpecName "kube-api-access-x8njt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.445044 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" (UID: "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.473671 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data" (OuterVolumeSpecName: "config-data") pod "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" (UID: "6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.522245 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.522285 4803 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.522299 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8njt\" (UniqueName: \"kubernetes.io/projected/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-kube-api-access-x8njt\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:33 crc kubenswrapper[4803]: I0320 17:35:33.522312 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8-config-data\") on node 
\"crc\" DevicePath \"\"" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.036636 4803 generic.go:334] "Generic (PLEG): container finished" podID="308526b2-6f01-494f-97c8-67e2af66a463" containerID="a4a638f11febcc5451d3de9db9cad3a64d2db28153134d5471967dfec9a3225c" exitCode=0 Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.036679 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"308526b2-6f01-494f-97c8-67e2af66a463","Type":"ContainerDied","Data":"a4a638f11febcc5451d3de9db9cad3a64d2db28153134d5471967dfec9a3225c"} Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.039330 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" event={"ID":"6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8","Type":"ContainerDied","Data":"eee53db6489df9fe96abbe84beca496ea288bb0bd9a833e98a8b8731ae42e73f"} Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.039438 4803 scope.go:117] "RemoveContainer" containerID="f38c55f91047cc3d1c7fc29a49dce068912a14eb8b7ac9e771651313896d0771" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.039378 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.073975 4803 scope.go:117] "RemoveContainer" containerID="a48c60edc097556e8ec87b68958c92dbf2a28fcf65bdbf2d802c00855ab77d0a" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.074059 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-68dc9dd7c8-7lvg9"] Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.082516 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-68dc9dd7c8-7lvg9"] Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.327930 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7b667979-mh5hz" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.158:5353: i/o timeout" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.367767 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.455509 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-scripts\") pod \"308526b2-6f01-494f-97c8-67e2af66a463\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.455910 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data\") pod \"308526b2-6f01-494f-97c8-67e2af66a463\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.456048 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-combined-ca-bundle\") pod \"308526b2-6f01-494f-97c8-67e2af66a463\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.456142 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/308526b2-6f01-494f-97c8-67e2af66a463-etc-machine-id\") pod \"308526b2-6f01-494f-97c8-67e2af66a463\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.456288 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data-custom\") pod \"308526b2-6f01-494f-97c8-67e2af66a463\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.456383 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlkhf\" (UniqueName: 
\"kubernetes.io/projected/308526b2-6f01-494f-97c8-67e2af66a463-kube-api-access-rlkhf\") pod \"308526b2-6f01-494f-97c8-67e2af66a463\" (UID: \"308526b2-6f01-494f-97c8-67e2af66a463\") " Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.456408 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/308526b2-6f01-494f-97c8-67e2af66a463-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "308526b2-6f01-494f-97c8-67e2af66a463" (UID: "308526b2-6f01-494f-97c8-67e2af66a463"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.462008 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308526b2-6f01-494f-97c8-67e2af66a463-kube-api-access-rlkhf" (OuterVolumeSpecName: "kube-api-access-rlkhf") pod "308526b2-6f01-494f-97c8-67e2af66a463" (UID: "308526b2-6f01-494f-97c8-67e2af66a463"). InnerVolumeSpecName "kube-api-access-rlkhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.462134 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-scripts" (OuterVolumeSpecName: "scripts") pod "308526b2-6f01-494f-97c8-67e2af66a463" (UID: "308526b2-6f01-494f-97c8-67e2af66a463"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.462852 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "308526b2-6f01-494f-97c8-67e2af66a463" (UID: "308526b2-6f01-494f-97c8-67e2af66a463"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.520821 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "308526b2-6f01-494f-97c8-67e2af66a463" (UID: "308526b2-6f01-494f-97c8-67e2af66a463"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.558593 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.558622 4803 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/308526b2-6f01-494f-97c8-67e2af66a463-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.558632 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.558641 4803 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.558649 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlkhf\" (UniqueName: \"kubernetes.io/projected/308526b2-6f01-494f-97c8-67e2af66a463-kube-api-access-rlkhf\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.567678 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data" (OuterVolumeSpecName: "config-data") pod "308526b2-6f01-494f-97c8-67e2af66a463" (UID: "308526b2-6f01-494f-97c8-67e2af66a463"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.660758 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/308526b2-6f01-494f-97c8-67e2af66a463-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:34 crc kubenswrapper[4803]: I0320 17:35:34.865221 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" path="/var/lib/kubelet/pods/6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8/volumes" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.050572 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"308526b2-6f01-494f-97c8-67e2af66a463","Type":"ContainerDied","Data":"2901dc0e4c8be48371079e4a51c3c33e1eaaf2e10c67b5ea77968d135b4f98ce"} Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.050614 4803 scope.go:117] "RemoveContainer" containerID="d920dde73a7030816b48063af3829541840a9c22afb026187f4d65fcb5b6eb97" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.050742 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.086996 4803 scope.go:117] "RemoveContainer" containerID="a4a638f11febcc5451d3de9db9cad3a64d2db28153134d5471967dfec9a3225c" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.099331 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.099388 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.120673 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121066 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-httpd" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121083 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-httpd" Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121101 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-api" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121107 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-api" Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121121 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="cinder-scheduler" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121127 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="cinder-scheduler" Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121142 4803 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121148 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api" Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121159 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerName="dnsmasq-dns" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121168 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerName="dnsmasq-dns" Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121186 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api-log" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121194 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api-log" Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121207 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="probe" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121213 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="probe" Mar 20 17:35:35 crc kubenswrapper[4803]: E0320 17:35:35.121227 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerName="init" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121234 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerName="init" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121432 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" 
containerName="neutron-httpd" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121441 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="probe" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121448 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="308526b2-6f01-494f-97c8-67e2af66a463" containerName="cinder-scheduler" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121459 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed2a5efb-ed66-4e83-96e9-12ae1bf75905" containerName="dnsmasq-dns" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121471 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api-log" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121480 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6570dd6e-761d-4d3d-99e8-5dfaba169520" containerName="neutron-api" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.121494 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.122423 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.125138 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.141742 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.183475 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.183976 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.184050 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-848zr\" (UniqueName: \"kubernetes.io/projected/78208796-d6b5-472e-9f4b-0f582d5bcfc9-kube-api-access-848zr\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.184203 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-scripts\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 
17:35:35.184262 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78208796-d6b5-472e-9f4b-0f582d5bcfc9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.184356 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-config-data\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.285673 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-848zr\" (UniqueName: \"kubernetes.io/projected/78208796-d6b5-472e-9f4b-0f582d5bcfc9-kube-api-access-848zr\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.285739 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-scripts\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.285767 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78208796-d6b5-472e-9f4b-0f582d5bcfc9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.285807 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-config-data\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.285877 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.285911 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.285903 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78208796-d6b5-472e-9f4b-0f582d5bcfc9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.290144 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.290681 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " 
pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.294351 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-scripts\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.294676 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78208796-d6b5-472e-9f4b-0f582d5bcfc9-config-data\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.309081 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-848zr\" (UniqueName: \"kubernetes.io/projected/78208796-d6b5-472e-9f4b-0f582d5bcfc9-kube-api-access-848zr\") pod \"cinder-scheduler-0\" (UID: \"78208796-d6b5-472e-9f4b-0f582d5bcfc9\") " pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.473980 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 17:35:35 crc kubenswrapper[4803]: I0320 17:35:35.864027 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 17:35:36 crc kubenswrapper[4803]: I0320 17:35:36.008000 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 17:35:36 crc kubenswrapper[4803]: I0320 17:35:36.066186 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78208796-d6b5-472e-9f4b-0f582d5bcfc9","Type":"ContainerStarted","Data":"f99b877f21f4781bcba01ba5b8db406a12da5dfd6df5449ba81a3aca81b9b91e"} Mar 20 17:35:36 crc kubenswrapper[4803]: I0320 17:35:36.518757 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:36 crc kubenswrapper[4803]: I0320 17:35:36.519501 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:36 crc kubenswrapper[4803]: I0320 17:35:36.869889 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308526b2-6f01-494f-97c8-67e2af66a463" path="/var/lib/kubelet/pods/308526b2-6f01-494f-97c8-67e2af66a463/volumes" Mar 20 17:35:37 crc kubenswrapper[4803]: I0320 17:35:37.086980 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78208796-d6b5-472e-9f4b-0f582d5bcfc9","Type":"ContainerStarted","Data":"48b0c666eb44dea3e835bb6ebe904a8a1de9455e9cfa20f953aa3d9293699489"} Mar 20 17:35:38 crc kubenswrapper[4803]: I0320 17:35:38.100165 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78208796-d6b5-472e-9f4b-0f582d5bcfc9","Type":"ContainerStarted","Data":"9fc456b2fda0493f06537384efc17481fe6d63f10433e5f187893d2daf747f94"} Mar 20 17:35:38 crc kubenswrapper[4803]: I0320 17:35:38.126743 4803 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.126727423 podStartE2EDuration="3.126727423s" podCreationTimestamp="2026-03-20 17:35:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:38.121893822 +0000 UTC m=+1148.033485912" watchObservedRunningTime="2026-03-20 17:35:38.126727423 +0000 UTC m=+1148.038319493" Mar 20 17:35:38 crc kubenswrapper[4803]: I0320 17:35:38.246029 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:35:38 crc kubenswrapper[4803]: I0320 17:35:38.246130 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:35:38 crc kubenswrapper[4803]: I0320 17:35:38.396508 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": dial tcp 10.217.0.167:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 20 17:35:38 crc kubenswrapper[4803]: I0320 17:35:38.398199 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-68dc9dd7c8-7lvg9" podUID="6ba8dd26-ada6-4079-8ff8-4e7fdc34e6b8" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.167:9311/healthcheck\": dial tcp 10.217.0.167:9311: i/o 
timeout (Client.Timeout exceeded while awaiting headers)" Mar 20 17:35:39 crc kubenswrapper[4803]: I0320 17:35:39.559372 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:39 crc kubenswrapper[4803]: I0320 17:35:39.559987 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5984b77b84-4dhqv" Mar 20 17:35:39 crc kubenswrapper[4803]: I0320 17:35:39.676870 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68bd594944-fvgn8"] Mar 20 17:35:39 crc kubenswrapper[4803]: I0320 17:35:39.677222 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68bd594944-fvgn8" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-log" containerID="cri-o://6ab2cf5f45350281da36b5c6137444d215cbfd248825c7960db825bb0ae4aaa9" gracePeriod=30 Mar 20 17:35:39 crc kubenswrapper[4803]: I0320 17:35:39.677265 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-68bd594944-fvgn8" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-api" containerID="cri-o://af5b4baeba5181ab821cd4086bfa7f3a4696767e199b4932917faa1367122571" gracePeriod=30 Mar 20 17:35:39 crc kubenswrapper[4803]: I0320 17:35:39.778407 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-85847589bb-9pbbf" Mar 20 17:35:40 crc kubenswrapper[4803]: I0320 17:35:40.121170 4803 generic.go:334] "Generic (PLEG): container finished" podID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerID="6ab2cf5f45350281da36b5c6137444d215cbfd248825c7960db825bb0ae4aaa9" exitCode=143 Mar 20 17:35:40 crc kubenswrapper[4803]: I0320 17:35:40.121260 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bd594944-fvgn8" 
event={"ID":"c7e954db-f75f-4ab2-93af-c3f0738748ac","Type":"ContainerDied","Data":"6ab2cf5f45350281da36b5c6137444d215cbfd248825c7960db825bb0ae4aaa9"} Mar 20 17:35:40 crc kubenswrapper[4803]: I0320 17:35:40.474393 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 17:35:41 crc kubenswrapper[4803]: I0320 17:35:41.663344 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75b65b9966-sh4pn" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.152598 4803 generic.go:334] "Generic (PLEG): container finished" podID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerID="af5b4baeba5181ab821cd4086bfa7f3a4696767e199b4932917faa1367122571" exitCode=0 Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.152791 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bd594944-fvgn8" event={"ID":"c7e954db-f75f-4ab2-93af-c3f0738748ac","Type":"ContainerDied","Data":"af5b4baeba5181ab821cd4086bfa7f3a4696767e199b4932917faa1367122571"} Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.230456 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.355753 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 17:35:43 crc kubenswrapper[4803]: E0320 17:35:43.359292 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-api" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.359443 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-api" Mar 20 17:35:43 crc kubenswrapper[4803]: E0320 17:35:43.359555 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-log" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.359610 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-log" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.360421 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-api" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.360548 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" containerName="placement-log" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.361496 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.368646 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.368949 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hb4s5" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.369140 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.373625 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.389512 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-combined-ca-bundle\") pod \"c7e954db-f75f-4ab2-93af-c3f0738748ac\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.389613 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-internal-tls-certs\") pod \"c7e954db-f75f-4ab2-93af-c3f0738748ac\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.389684 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e954db-f75f-4ab2-93af-c3f0738748ac-logs\") pod \"c7e954db-f75f-4ab2-93af-c3f0738748ac\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.389730 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-config-data\") pod \"c7e954db-f75f-4ab2-93af-c3f0738748ac\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.389816 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-public-tls-certs\") pod \"c7e954db-f75f-4ab2-93af-c3f0738748ac\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.389887 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zblsj\" (UniqueName: \"kubernetes.io/projected/c7e954db-f75f-4ab2-93af-c3f0738748ac-kube-api-access-zblsj\") pod \"c7e954db-f75f-4ab2-93af-c3f0738748ac\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.389965 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-scripts\") pod \"c7e954db-f75f-4ab2-93af-c3f0738748ac\" (UID: \"c7e954db-f75f-4ab2-93af-c3f0738748ac\") " Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.390308 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7e954db-f75f-4ab2-93af-c3f0738748ac-logs" (OuterVolumeSpecName: "logs") pod "c7e954db-f75f-4ab2-93af-c3f0738748ac" (UID: "c7e954db-f75f-4ab2-93af-c3f0738748ac"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.404019 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7e954db-f75f-4ab2-93af-c3f0738748ac-kube-api-access-zblsj" (OuterVolumeSpecName: "kube-api-access-zblsj") pod "c7e954db-f75f-4ab2-93af-c3f0738748ac" (UID: "c7e954db-f75f-4ab2-93af-c3f0738748ac"). InnerVolumeSpecName "kube-api-access-zblsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.407595 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-scripts" (OuterVolumeSpecName: "scripts") pod "c7e954db-f75f-4ab2-93af-c3f0738748ac" (UID: "c7e954db-f75f-4ab2-93af-c3f0738748ac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.441161 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-config-data" (OuterVolumeSpecName: "config-data") pod "c7e954db-f75f-4ab2-93af-c3f0738748ac" (UID: "c7e954db-f75f-4ab2-93af-c3f0738748ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.470300 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7e954db-f75f-4ab2-93af-c3f0738748ac" (UID: "c7e954db-f75f-4ab2-93af-c3f0738748ac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.486551 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c7e954db-f75f-4ab2-93af-c3f0738748ac" (UID: "c7e954db-f75f-4ab2-93af-c3f0738748ac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.491979 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15f9821-6db3-4c08-b731-d0c7349b4076-openstack-config\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492111 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15f9821-6db3-4c08-b731-d0c7349b4076-openstack-config-secret\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492181 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzf75\" (UniqueName: \"kubernetes.io/projected/d15f9821-6db3-4c08-b731-d0c7349b4076-kube-api-access-xzf75\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492205 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15f9821-6db3-4c08-b731-d0c7349b4076-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492275 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c7e954db-f75f-4ab2-93af-c3f0738748ac-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492290 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492304 4803 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492318 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zblsj\" (UniqueName: \"kubernetes.io/projected/c7e954db-f75f-4ab2-93af-c3f0738748ac-kube-api-access-zblsj\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492330 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.492340 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.503152 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c7e954db-f75f-4ab2-93af-c3f0738748ac" (UID: 
"c7e954db-f75f-4ab2-93af-c3f0738748ac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.593862 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15f9821-6db3-4c08-b731-d0c7349b4076-openstack-config-secret\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.593930 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzf75\" (UniqueName: \"kubernetes.io/projected/d15f9821-6db3-4c08-b731-d0c7349b4076-kube-api-access-xzf75\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.593953 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15f9821-6db3-4c08-b731-d0c7349b4076-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.594008 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15f9821-6db3-4c08-b731-d0c7349b4076-openstack-config\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.594083 4803 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7e954db-f75f-4ab2-93af-c3f0738748ac-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.594859 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d15f9821-6db3-4c08-b731-d0c7349b4076-openstack-config\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.599969 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d15f9821-6db3-4c08-b731-d0c7349b4076-openstack-config-secret\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.600155 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d15f9821-6db3-4c08-b731-d0c7349b4076-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.612212 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzf75\" (UniqueName: \"kubernetes.io/projected/d15f9821-6db3-4c08-b731-d0c7349b4076-kube-api-access-xzf75\") pod \"openstackclient\" (UID: \"d15f9821-6db3-4c08-b731-d0c7349b4076\") " pod="openstack/openstackclient" Mar 20 17:35:43 crc kubenswrapper[4803]: I0320 17:35:43.688685 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.161719 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68bd594944-fvgn8" event={"ID":"c7e954db-f75f-4ab2-93af-c3f0738748ac","Type":"ContainerDied","Data":"fb545e96b631a2d935922a41a38694bf2f88a00a41527992a85d2e3be6b73b3b"} Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.162071 4803 scope.go:117] "RemoveContainer" containerID="af5b4baeba5181ab821cd4086bfa7f3a4696767e199b4932917faa1367122571" Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.161770 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68bd594944-fvgn8" Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.188408 4803 scope.go:117] "RemoveContainer" containerID="6ab2cf5f45350281da36b5c6137444d215cbfd248825c7960db825bb0ae4aaa9" Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.194269 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68bd594944-fvgn8"] Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.201432 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.208430 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-68bd594944-fvgn8"] Mar 20 17:35:44 crc kubenswrapper[4803]: I0320 17:35:44.856877 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7e954db-f75f-4ab2-93af-c3f0738748ac" path="/var/lib/kubelet/pods/c7e954db-f75f-4ab2-93af-c3f0738748ac/volumes" Mar 20 17:35:45 crc kubenswrapper[4803]: I0320 17:35:45.169700 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d15f9821-6db3-4c08-b731-d0c7349b4076","Type":"ContainerStarted","Data":"246495ae4e8bdb1d647b648f9795a1d97cee2ec49ddcc43667b6d77e80c59c15"} Mar 20 17:35:45 crc kubenswrapper[4803]: I0320 
17:35:45.784188 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.230285 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5598996667-w8j76"] Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.232602 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.241614 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.242368 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.243261 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.249438 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5598996667-w8j76"] Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.377124 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckdzq\" (UniqueName: \"kubernetes.io/projected/8b611951-3d24-49d2-a5bc-18d41e478610-kube-api-access-ckdzq\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.377201 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-internal-tls-certs\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc 
kubenswrapper[4803]: I0320 17:35:47.377241 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-combined-ca-bundle\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.377283 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b611951-3d24-49d2-a5bc-18d41e478610-log-httpd\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.377302 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b611951-3d24-49d2-a5bc-18d41e478610-run-httpd\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.377319 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-public-tls-certs\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.377343 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-config-data\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" 
Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.377360 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b611951-3d24-49d2-a5bc-18d41e478610-etc-swift\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.478954 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b611951-3d24-49d2-a5bc-18d41e478610-log-httpd\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.478990 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b611951-3d24-49d2-a5bc-18d41e478610-run-httpd\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.479010 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-public-tls-certs\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.479034 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-config-data\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.479056 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b611951-3d24-49d2-a5bc-18d41e478610-etc-swift\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.479114 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckdzq\" (UniqueName: \"kubernetes.io/projected/8b611951-3d24-49d2-a5bc-18d41e478610-kube-api-access-ckdzq\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.479155 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-internal-tls-certs\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.479190 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-combined-ca-bundle\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.480462 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b611951-3d24-49d2-a5bc-18d41e478610-run-httpd\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.480709 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8b611951-3d24-49d2-a5bc-18d41e478610-log-httpd\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.489221 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-config-data\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.489798 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-combined-ca-bundle\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.490188 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-public-tls-certs\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.490667 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8b611951-3d24-49d2-a5bc-18d41e478610-etc-swift\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.495176 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8b611951-3d24-49d2-a5bc-18d41e478610-internal-tls-certs\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.514342 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckdzq\" (UniqueName: \"kubernetes.io/projected/8b611951-3d24-49d2-a5bc-18d41e478610-kube-api-access-ckdzq\") pod \"swift-proxy-5598996667-w8j76\" (UID: \"8b611951-3d24-49d2-a5bc-18d41e478610\") " pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:47 crc kubenswrapper[4803]: I0320 17:35:47.559629 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.104945 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5598996667-w8j76"] Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.198018 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598996667-w8j76" event={"ID":"8b611951-3d24-49d2-a5bc-18d41e478610","Type":"ContainerStarted","Data":"b373e7a3dc6e8c7f8a28e1ebdc6305e6f284316e32425f2d4eab9da910901e53"} Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.459625 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.460001 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-central-agent" containerID="cri-o://bff37b47bc7555f549b0c6203954ec0d02e36d7a6dba536236b062330184be15" gracePeriod=30 Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.460149 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" 
containerName="proxy-httpd" containerID="cri-o://af3018a478f58bc49449e6e7cba6f378481703f4bc819c22afd93d43e3228f30" gracePeriod=30 Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.460203 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="sg-core" containerID="cri-o://936bb04a57759b534c0f972fdb52034feee0bcf9146e96363e79ff604490434a" gracePeriod=30 Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.460261 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-notification-agent" containerID="cri-o://4b7680dd7ce3f95d2239bf669dacd6a291aef4b734cafb33bbfd0cb3b924ad73" gracePeriod=30 Mar 20 17:35:48 crc kubenswrapper[4803]: I0320 17:35:48.470423 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.173:3000/\": EOF" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.208969 4803 generic.go:334] "Generic (PLEG): container finished" podID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerID="af3018a478f58bc49449e6e7cba6f378481703f4bc819c22afd93d43e3228f30" exitCode=0 Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.209220 4803 generic.go:334] "Generic (PLEG): container finished" podID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerID="936bb04a57759b534c0f972fdb52034feee0bcf9146e96363e79ff604490434a" exitCode=2 Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.209229 4803 generic.go:334] "Generic (PLEG): container finished" podID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerID="4b7680dd7ce3f95d2239bf669dacd6a291aef4b734cafb33bbfd0cb3b924ad73" exitCode=0 Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.209236 4803 generic.go:334] "Generic (PLEG): 
container finished" podID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerID="bff37b47bc7555f549b0c6203954ec0d02e36d7a6dba536236b062330184be15" exitCode=0 Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.209271 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerDied","Data":"af3018a478f58bc49449e6e7cba6f378481703f4bc819c22afd93d43e3228f30"} Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.209344 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerDied","Data":"936bb04a57759b534c0f972fdb52034feee0bcf9146e96363e79ff604490434a"} Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.209360 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerDied","Data":"4b7680dd7ce3f95d2239bf669dacd6a291aef4b734cafb33bbfd0cb3b924ad73"} Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.209371 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerDied","Data":"bff37b47bc7555f549b0c6203954ec0d02e36d7a6dba536236b062330184be15"} Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.211376 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598996667-w8j76" event={"ID":"8b611951-3d24-49d2-a5bc-18d41e478610","Type":"ContainerStarted","Data":"af7b81edfa021b84d644f26073c04b120e0b81b243ec0093f15f9d44e19350b6"} Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.211411 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5598996667-w8j76" event={"ID":"8b611951-3d24-49d2-a5bc-18d41e478610","Type":"ContainerStarted","Data":"9b4bf3cab7f1489ba882153cd02ee54131527676750d2128999df2e997a4e2d9"} Mar 20 17:35:49 crc 
kubenswrapper[4803]: I0320 17:35:49.211895 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.233613 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5598996667-w8j76" podStartSLOduration=2.233594122 podStartE2EDuration="2.233594122s" podCreationTimestamp="2026-03-20 17:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:35:49.23055717 +0000 UTC m=+1159.142149250" watchObservedRunningTime="2026-03-20 17:35:49.233594122 +0000 UTC m=+1159.145186192" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.274137 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.327230 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-combined-ca-bundle\") pod \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.327272 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-log-httpd\") pod \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.327292 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cngv5\" (UniqueName: \"kubernetes.io/projected/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-kube-api-access-cngv5\") pod \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " Mar 20 17:35:49 crc 
kubenswrapper[4803]: I0320 17:35:49.327460 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-scripts\") pod \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.327484 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-sg-core-conf-yaml\") pod \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.327516 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-run-httpd\") pod \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.327569 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-config-data\") pod \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\" (UID: \"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91\") " Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.330768 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" (UID: "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.331218 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" (UID: "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.335926 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-scripts" (OuterVolumeSpecName: "scripts") pod "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" (UID: "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.336032 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-kube-api-access-cngv5" (OuterVolumeSpecName: "kube-api-access-cngv5") pod "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" (UID: "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91"). InnerVolumeSpecName "kube-api-access-cngv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.361908 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" (UID: "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.406747 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" (UID: "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.430147 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.430183 4803 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.430199 4803 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.430211 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.430223 4803 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.430237 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cngv5\" (UniqueName: 
\"kubernetes.io/projected/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-kube-api-access-cngv5\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.431342 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-config-data" (OuterVolumeSpecName: "config-data") pod "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" (UID: "7d5fb2c1-3925-4353-a0f9-c3e00ac43b91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:49 crc kubenswrapper[4803]: I0320 17:35:49.532284 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.224588 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d5fb2c1-3925-4353-a0f9-c3e00ac43b91","Type":"ContainerDied","Data":"6604352edbf5672477a53f226bfa7a69c5155e198737ed69726600ee44b8d7fe"} Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.224842 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5598996667-w8j76" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.224861 4803 scope.go:117] "RemoveContainer" containerID="af3018a478f58bc49449e6e7cba6f378481703f4bc819c22afd93d43e3228f30" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.224620 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.259963 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.281398 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296071 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:50 crc kubenswrapper[4803]: E0320 17:35:50.296479 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-central-agent" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296498 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-central-agent" Mar 20 17:35:50 crc kubenswrapper[4803]: E0320 17:35:50.296534 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-notification-agent" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296542 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-notification-agent" Mar 20 17:35:50 crc kubenswrapper[4803]: E0320 17:35:50.296562 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="proxy-httpd" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296568 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="proxy-httpd" Mar 20 17:35:50 crc kubenswrapper[4803]: E0320 17:35:50.296584 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="sg-core" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296589 4803 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="sg-core" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296737 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-central-agent" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296747 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="proxy-httpd" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296765 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="sg-core" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.296778 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" containerName="ceilometer-notification-agent" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.298667 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.300752 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.301033 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.311800 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.350986 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.351258 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-run-httpd\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.351386 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-config-data\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.351497 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-log-httpd\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " 
pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.351605 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.351721 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-scripts\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.351814 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7csq8\" (UniqueName: \"kubernetes.io/projected/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-kube-api-access-7csq8\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.452687 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-run-httpd\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.452871 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-config-data\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.452960 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-log-httpd\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.453068 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.453147 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-scripts\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.453208 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7csq8\" (UniqueName: \"kubernetes.io/projected/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-kube-api-access-7csq8\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.453334 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.454233 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-run-httpd\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc 
kubenswrapper[4803]: I0320 17:35:50.455142 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-log-httpd\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.458701 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.459047 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.459709 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-scripts\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.476483 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-config-data\") pod \"ceilometer-0\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.478011 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7csq8\" (UniqueName: \"kubernetes.io/projected/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-kube-api-access-7csq8\") pod \"ceilometer-0\" (UID: 
\"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.647944 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:35:50 crc kubenswrapper[4803]: I0320 17:35:50.882868 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d5fb2c1-3925-4353-a0f9-c3e00ac43b91" path="/var/lib/kubelet/pods/7d5fb2c1-3925-4353-a0f9-c3e00ac43b91/volumes" Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 17:35:51.465039 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 17:35:51.469281 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-log" containerID="cri-o://f0240cdb4bc1cd10ecc35ba3f331154b3247f37c1243547d2fa84fce4534a75a" gracePeriod=30 Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 17:35:51.469554 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-httpd" containerID="cri-o://fbee115a69c076117905f8a61c000191d7b543588ff6c357136ee8e193540c80" gracePeriod=30 Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 17:35:51.663896 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-75b65b9966-sh4pn" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 17:35:51.664050 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 
17:35:51.994718 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 17:35:51.994935 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-log" containerID="cri-o://df685bb06f31c910c8919b80d9b49a695a69dca779ad626488ca56799fd989e1" gracePeriod=30 Mar 20 17:35:51 crc kubenswrapper[4803]: I0320 17:35:51.995313 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-httpd" containerID="cri-o://c3e79caec9548dc0bbc047750096a62108fde9eb28fbc970e8a22062d4aad3b6" gracePeriod=30 Mar 20 17:35:52 crc kubenswrapper[4803]: I0320 17:35:52.246457 4803 generic.go:334] "Generic (PLEG): container finished" podID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerID="df685bb06f31c910c8919b80d9b49a695a69dca779ad626488ca56799fd989e1" exitCode=143 Mar 20 17:35:52 crc kubenswrapper[4803]: I0320 17:35:52.246544 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f22bff9a-80b3-41a7-9f0d-337b7c61b46a","Type":"ContainerDied","Data":"df685bb06f31c910c8919b80d9b49a695a69dca779ad626488ca56799fd989e1"} Mar 20 17:35:52 crc kubenswrapper[4803]: I0320 17:35:52.251249 4803 generic.go:334] "Generic (PLEG): container finished" podID="824aff30-6d5e-4489-bf27-79910aafe31e" containerID="f0240cdb4bc1cd10ecc35ba3f331154b3247f37c1243547d2fa84fce4534a75a" exitCode=143 Mar 20 17:35:52 crc kubenswrapper[4803]: I0320 17:35:52.251292 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"824aff30-6d5e-4489-bf27-79910aafe31e","Type":"ContainerDied","Data":"f0240cdb4bc1cd10ecc35ba3f331154b3247f37c1243547d2fa84fce4534a75a"} Mar 20 17:35:53 crc 
kubenswrapper[4803]: I0320 17:35:53.523362 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.542567 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-x8dss"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.543632 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.554284 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x8dss"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.656761 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-055e-account-create-update-qzqpx"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.657994 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.660744 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.669398 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9zd4x"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.670688 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.682192 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-055e-account-create-update-qzqpx"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.691803 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9zd4x"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.694140 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpb6c\" (UniqueName: \"kubernetes.io/projected/e7fbac44-b743-4049-ae33-92d8fdb12dfd-kube-api-access-jpb6c\") pod \"nova-api-db-create-x8dss\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.694248 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbac44-b743-4049-ae33-92d8fdb12dfd-operator-scripts\") pod \"nova-api-db-create-x8dss\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.770168 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-t2hrj"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.771260 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.776773 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t2hrj"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.796395 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbac44-b743-4049-ae33-92d8fdb12dfd-operator-scripts\") pod \"nova-api-db-create-x8dss\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.796441 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkrs2\" (UniqueName: \"kubernetes.io/projected/ae48d528-5867-44e8-a58b-7a035fd1c0a7-kube-api-access-rkrs2\") pod \"nova-cell0-db-create-9zd4x\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.796538 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpb6c\" (UniqueName: \"kubernetes.io/projected/e7fbac44-b743-4049-ae33-92d8fdb12dfd-kube-api-access-jpb6c\") pod \"nova-api-db-create-x8dss\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.796573 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae48d528-5867-44e8-a58b-7a035fd1c0a7-operator-scripts\") pod \"nova-cell0-db-create-9zd4x\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.796615 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75c75d4-5e6d-40b0-831a-4114a94c6c64-operator-scripts\") pod \"nova-api-055e-account-create-update-qzqpx\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.796650 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcdcc\" (UniqueName: \"kubernetes.io/projected/f75c75d4-5e6d-40b0-831a-4114a94c6c64-kube-api-access-dcdcc\") pod \"nova-api-055e-account-create-update-qzqpx\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.797316 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbac44-b743-4049-ae33-92d8fdb12dfd-operator-scripts\") pod \"nova-api-db-create-x8dss\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.815383 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpb6c\" (UniqueName: \"kubernetes.io/projected/e7fbac44-b743-4049-ae33-92d8fdb12dfd-kube-api-access-jpb6c\") pod \"nova-api-db-create-x8dss\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.859846 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ded6-account-create-update-b2sc5"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.860836 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.863248 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.866418 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ded6-account-create-update-b2sc5"] Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.897973 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75c75d4-5e6d-40b0-831a-4114a94c6c64-operator-scripts\") pod \"nova-api-055e-account-create-update-qzqpx\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.898050 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcdcc\" (UniqueName: \"kubernetes.io/projected/f75c75d4-5e6d-40b0-831a-4114a94c6c64-kube-api-access-dcdcc\") pod \"nova-api-055e-account-create-update-qzqpx\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.898104 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkrs2\" (UniqueName: \"kubernetes.io/projected/ae48d528-5867-44e8-a58b-7a035fd1c0a7-kube-api-access-rkrs2\") pod \"nova-cell0-db-create-9zd4x\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.898195 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45b6fb6-a659-4a13-96d8-2f41634e3423-operator-scripts\") pod \"nova-cell1-db-create-t2hrj\" (UID: 
\"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.898225 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae48d528-5867-44e8-a58b-7a035fd1c0a7-operator-scripts\") pod \"nova-cell0-db-create-9zd4x\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.898259 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdmsw\" (UniqueName: \"kubernetes.io/projected/a45b6fb6-a659-4a13-96d8-2f41634e3423-kube-api-access-tdmsw\") pod \"nova-cell1-db-create-t2hrj\" (UID: \"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.898733 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75c75d4-5e6d-40b0-831a-4114a94c6c64-operator-scripts\") pod \"nova-api-055e-account-create-update-qzqpx\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.898943 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae48d528-5867-44e8-a58b-7a035fd1c0a7-operator-scripts\") pod \"nova-cell0-db-create-9zd4x\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.914904 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.917866 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkrs2\" (UniqueName: \"kubernetes.io/projected/ae48d528-5867-44e8-a58b-7a035fd1c0a7-kube-api-access-rkrs2\") pod \"nova-cell0-db-create-9zd4x\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:54 crc kubenswrapper[4803]: I0320 17:35:54.929214 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcdcc\" (UniqueName: \"kubernetes.io/projected/f75c75d4-5e6d-40b0-831a-4114a94c6c64-kube-api-access-dcdcc\") pod \"nova-api-055e-account-create-update-qzqpx\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:54.999329 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdmsw\" (UniqueName: \"kubernetes.io/projected/a45b6fb6-a659-4a13-96d8-2f41634e3423-kube-api-access-tdmsw\") pod \"nova-cell1-db-create-t2hrj\" (UID: \"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:54.999415 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7km\" (UniqueName: \"kubernetes.io/projected/ad456b82-4bac-43be-b409-afcea1169a3a-kube-api-access-sh7km\") pod \"nova-cell0-ded6-account-create-update-b2sc5\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:54.999587 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45b6fb6-a659-4a13-96d8-2f41634e3423-operator-scripts\") pod 
\"nova-cell1-db-create-t2hrj\" (UID: \"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:54.999611 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad456b82-4bac-43be-b409-afcea1169a3a-operator-scripts\") pod \"nova-cell0-ded6-account-create-update-b2sc5\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.002916 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45b6fb6-a659-4a13-96d8-2f41634e3423-operator-scripts\") pod \"nova-cell1-db-create-t2hrj\" (UID: \"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.017058 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdmsw\" (UniqueName: \"kubernetes.io/projected/a45b6fb6-a659-4a13-96d8-2f41634e3423-kube-api-access-tdmsw\") pod \"nova-cell1-db-create-t2hrj\" (UID: \"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.022911 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.033666 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.055869 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-da07-account-create-update-9c6n2"] Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.057382 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.061229 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.069036 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-da07-account-create-update-9c6n2"] Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.086439 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.100703 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad456b82-4bac-43be-b409-afcea1169a3a-operator-scripts\") pod \"nova-cell0-ded6-account-create-update-b2sc5\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.100801 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7km\" (UniqueName: \"kubernetes.io/projected/ad456b82-4bac-43be-b409-afcea1169a3a-kube-api-access-sh7km\") pod \"nova-cell0-ded6-account-create-update-b2sc5\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.101640 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ad456b82-4bac-43be-b409-afcea1169a3a-operator-scripts\") pod \"nova-cell0-ded6-account-create-update-b2sc5\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.118657 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7km\" (UniqueName: \"kubernetes.io/projected/ad456b82-4bac-43be-b409-afcea1169a3a-kube-api-access-sh7km\") pod \"nova-cell0-ded6-account-create-update-b2sc5\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.179246 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.202574 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8fd306c-b4b4-4f8a-812c-4434756c38dc-operator-scripts\") pod \"nova-cell1-da07-account-create-update-9c6n2\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.202617 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njn7s\" (UniqueName: \"kubernetes.io/projected/b8fd306c-b4b4-4f8a-812c-4434756c38dc-kube-api-access-njn7s\") pod \"nova-cell1-da07-account-create-update-9c6n2\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.295320 4803 generic.go:334] "Generic (PLEG): container finished" podID="824aff30-6d5e-4489-bf27-79910aafe31e" 
containerID="fbee115a69c076117905f8a61c000191d7b543588ff6c357136ee8e193540c80" exitCode=0 Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.295416 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"824aff30-6d5e-4489-bf27-79910aafe31e","Type":"ContainerDied","Data":"fbee115a69c076117905f8a61c000191d7b543588ff6c357136ee8e193540c80"} Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.298748 4803 generic.go:334] "Generic (PLEG): container finished" podID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerID="c3e79caec9548dc0bbc047750096a62108fde9eb28fbc970e8a22062d4aad3b6" exitCode=0 Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.298795 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f22bff9a-80b3-41a7-9f0d-337b7c61b46a","Type":"ContainerDied","Data":"c3e79caec9548dc0bbc047750096a62108fde9eb28fbc970e8a22062d4aad3b6"} Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.305636 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8fd306c-b4b4-4f8a-812c-4434756c38dc-operator-scripts\") pod \"nova-cell1-da07-account-create-update-9c6n2\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.305673 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njn7s\" (UniqueName: \"kubernetes.io/projected/b8fd306c-b4b4-4f8a-812c-4434756c38dc-kube-api-access-njn7s\") pod \"nova-cell1-da07-account-create-update-9c6n2\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.315874 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b8fd306c-b4b4-4f8a-812c-4434756c38dc-operator-scripts\") pod \"nova-cell1-da07-account-create-update-9c6n2\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.321936 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njn7s\" (UniqueName: \"kubernetes.io/projected/b8fd306c-b4b4-4f8a-812c-4434756c38dc-kube-api-access-njn7s\") pod \"nova-cell1-da07-account-create-update-9c6n2\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.361849 4803 scope.go:117] "RemoveContainer" containerID="936bb04a57759b534c0f972fdb52034feee0bcf9146e96363e79ff604490434a" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.371870 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.383288 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6b69588b57-phch4" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.464436 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-799cb79944-p24nt"] Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.464885 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-799cb79944-p24nt" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-api" containerID="cri-o://3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e" gracePeriod=30 Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.465096 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-799cb79944-p24nt" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-httpd" 
containerID="cri-o://742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d" gracePeriod=30 Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.643024 4803 scope.go:117] "RemoveContainer" containerID="4b7680dd7ce3f95d2239bf669dacd6a291aef4b734cafb33bbfd0cb3b924ad73" Mar 20 17:35:55 crc kubenswrapper[4803]: I0320 17:35:55.764239 4803 scope.go:117] "RemoveContainer" containerID="bff37b47bc7555f549b0c6203954ec0d02e36d7a6dba536236b062330184be15" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.139642 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238257 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-public-tls-certs\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238354 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-combined-ca-bundle\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238387 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238478 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-config-data\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: 
\"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238535 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-scripts\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238552 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzpk9\" (UniqueName: \"kubernetes.io/projected/824aff30-6d5e-4489-bf27-79910aafe31e-kube-api-access-zzpk9\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238611 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-httpd-run\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.238671 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-logs\") pod \"824aff30-6d5e-4489-bf27-79910aafe31e\" (UID: \"824aff30-6d5e-4489-bf27-79910aafe31e\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.239371 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-logs" (OuterVolumeSpecName: "logs") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.248674 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.250596 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.254747 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824aff30-6d5e-4489-bf27-79910aafe31e-kube-api-access-zzpk9" (OuterVolumeSpecName: "kube-api-access-zzpk9") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "kube-api-access-zzpk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.281685 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-scripts" (OuterVolumeSpecName: "scripts") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.292393 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.310800 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.333889 4803 generic.go:334] "Generic (PLEG): container finished" podID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerID="742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d" exitCode=0 Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.334195 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-799cb79944-p24nt" event={"ID":"ce4ab454-b5e3-435e-b406-ee5891a82b69","Type":"ContainerDied","Data":"742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d"} Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.335955 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"824aff30-6d5e-4489-bf27-79910aafe31e","Type":"ContainerDied","Data":"cb5be6fb0d30073e29afd2569eac9ec86bda75041effcb358329f79dbd107482"} Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.335986 4803 scope.go:117] "RemoveContainer" containerID="fbee115a69c076117905f8a61c000191d7b543588ff6c357136ee8e193540c80" 
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.336107 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343386 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d15f9821-6db3-4c08-b731-d0c7349b4076","Type":"ContainerStarted","Data":"3b80c1144824aff31b155f66be431cf7c1f20913639b7b5dd42dae3227edc4d3"} Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343559 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343587 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzpk9\" (UniqueName: \"kubernetes.io/projected/824aff30-6d5e-4489-bf27-79910aafe31e-kube-api-access-zzpk9\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343599 4803 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343607 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/824aff30-6d5e-4489-bf27-79910aafe31e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343615 4803 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343623 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.343643 4803 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.354099 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-config-data" (OuterVolumeSpecName: "config-data") pod "824aff30-6d5e-4489-bf27-79910aafe31e" (UID: "824aff30-6d5e-4489-bf27-79910aafe31e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.371815 4803 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.381981 4803 scope.go:117] "RemoveContainer" containerID="f0240cdb4bc1cd10ecc35ba3f331154b3247f37c1243547d2fa84fce4534a75a" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.421288 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.980532886 podStartE2EDuration="13.421271141s" podCreationTimestamp="2026-03-20 17:35:43 +0000 UTC" firstStartedPulling="2026-03-20 17:35:44.20275426 +0000 UTC m=+1154.114346340" lastFinishedPulling="2026-03-20 17:35:55.643492525 +0000 UTC m=+1165.555084595" observedRunningTime="2026-03-20 17:35:56.365327839 +0000 UTC m=+1166.276919909" watchObservedRunningTime="2026-03-20 17:35:56.421271141 +0000 UTC m=+1166.332863211" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.425139 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ded6-account-create-update-b2sc5"] Mar 
20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.439690 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-t2hrj"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.446903 4803 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.446923 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/824aff30-6d5e-4489-bf27-79910aafe31e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.463762 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9zd4x"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.469193 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.596885 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-055e-account-create-update-qzqpx"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.601960 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-x8dss"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.613862 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.693666 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.719256 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.736876 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:35:56 crc kubenswrapper[4803]: E0320 17:35:56.737343 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-httpd" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737361 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-httpd" Mar 20 17:35:56 crc kubenswrapper[4803]: E0320 17:35:56.737377 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-log" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737384 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-log" Mar 20 17:35:56 crc kubenswrapper[4803]: E0320 17:35:56.737394 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-httpd" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737400 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-httpd" Mar 20 17:35:56 crc kubenswrapper[4803]: E0320 17:35:56.737413 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-log" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737419 4803 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-log" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737617 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-log" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737626 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-log" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737644 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" containerName="glance-httpd" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.737654 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" containerName="glance-httpd" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.738553 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.750146 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.750613 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-da07-account-create-update-9c6n2"] Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751568 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t772\" (UniqueName: \"kubernetes.io/projected/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-kube-api-access-8t772\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751681 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-logs\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751725 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-httpd-run\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751785 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751818 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-internal-tls-certs\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751836 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-combined-ca-bundle\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751893 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-config-data\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.751920 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-scripts\") pod \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\" (UID: \"f22bff9a-80b3-41a7-9f0d-337b7c61b46a\") " Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.767579 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-logs" (OuterVolumeSpecName: "logs") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.767820 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.772864 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-scripts" (OuterVolumeSpecName: "scripts") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.774647 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.774858 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.775655 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.779429 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-kube-api-access-8t772" (OuterVolumeSpecName: "kube-api-access-8t772") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). InnerVolumeSpecName "kube-api-access-8t772". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.855776 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856186 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856223 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8af7e5-2a44-4caa-883c-7ef2027e69c4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856298 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-config-data\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856353 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5c65\" (UniqueName: \"kubernetes.io/projected/de8af7e5-2a44-4caa-883c-7ef2027e69c4-kube-api-access-m5c65\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856433 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8af7e5-2a44-4caa-883c-7ef2027e69c4-logs\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856460 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-scripts\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856557 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856651 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856664 4803 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856701 4803 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856713 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.856721 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t772\" (UniqueName: \"kubernetes.io/projected/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-kube-api-access-8t772\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.868100 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824aff30-6d5e-4489-bf27-79910aafe31e" path="/var/lib/kubelet/pods/824aff30-6d5e-4489-bf27-79910aafe31e/volumes"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959000 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959096 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959126 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8af7e5-2a44-4caa-883c-7ef2027e69c4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959204 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-config-data\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959242 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5c65\" (UniqueName: \"kubernetes.io/projected/de8af7e5-2a44-4caa-883c-7ef2027e69c4-kube-api-access-m5c65\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959269 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8af7e5-2a44-4caa-883c-7ef2027e69c4-logs\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959315 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-scripts\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.959364 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.960666 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de8af7e5-2a44-4caa-883c-7ef2027e69c4-logs\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.961433 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.961785 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/de8af7e5-2a44-4caa-883c-7ef2027e69c4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.969337 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-scripts\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.970304 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-config-data\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.971838 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.980249 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8af7e5-2a44-4caa-883c-7ef2027e69c4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.980267 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.986987 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5c65\" (UniqueName: \"kubernetes.io/projected/de8af7e5-2a44-4caa-883c-7ef2027e69c4-kube-api-access-m5c65\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:56 crc kubenswrapper[4803]: I0320 17:35:56.994878 4803 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.007469 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-config-data" (OuterVolumeSpecName: "config-data") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.047721 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"de8af7e5-2a44-4caa-883c-7ef2027e69c4\") " pod="openstack/glance-default-external-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.056399 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f22bff9a-80b3-41a7-9f0d-337b7c61b46a" (UID: "f22bff9a-80b3-41a7-9f0d-337b7c61b46a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.060566 4803 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.060591 4803 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.060602 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.060610 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f22bff9a-80b3-41a7-9f0d-337b7c61b46a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.358576 4803 generic.go:334] "Generic (PLEG): container finished" podID="a45b6fb6-a659-4a13-96d8-2f41634e3423" containerID="dc02b17a721a0e87422aea593e3725e0e205b4241ffe30bf4af60d3195f1f298" exitCode=0
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.361087 4803 generic.go:334] "Generic (PLEG): container finished" podID="ae48d528-5867-44e8-a58b-7a035fd1c0a7" containerID="65dca507c9ef24dc14925fd13203843ca6d0e038d28c89b10b63ce182f014c11" exitCode=0
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.363697 4803 generic.go:334] "Generic (PLEG): container finished" podID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerID="118e9705aaa8a134f7b51ee635781e210861eb775bae53298f2aeeb57f80314c" exitCode=137
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.365103 4803 generic.go:334] "Generic (PLEG): container finished" podID="ad456b82-4bac-43be-b409-afcea1169a3a" containerID="93eac6cd2687b29e56cb8aa14dd3ff285c9adbd212834dd76ce8b20dddf1148d" exitCode=0
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.381323 4803 generic.go:334] "Generic (PLEG): container finished" podID="e7fbac44-b743-4049-ae33-92d8fdb12dfd" containerID="8c57888648f8a3e10ca03931c4b38c398f05972b9e25aa40b62bd813a26a6b03" exitCode=0
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.383395 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449420 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t2hrj" event={"ID":"a45b6fb6-a659-4a13-96d8-2f41634e3423","Type":"ContainerDied","Data":"dc02b17a721a0e87422aea593e3725e0e205b4241ffe30bf4af60d3195f1f298"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449480 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t2hrj" event={"ID":"a45b6fb6-a659-4a13-96d8-2f41634e3423","Type":"ContainerStarted","Data":"0de152469403789f65c39bd255efc97fb2c63eb974c6eca01725ea343ca183a2"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449491 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9zd4x" event={"ID":"ae48d528-5867-44e8-a58b-7a035fd1c0a7","Type":"ContainerDied","Data":"65dca507c9ef24dc14925fd13203843ca6d0e038d28c89b10b63ce182f014c11"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449502 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9zd4x" event={"ID":"ae48d528-5867-44e8-a58b-7a035fd1c0a7","Type":"ContainerStarted","Data":"0f033bdad8bd94e09f76c2c3c786e2d1359a9337ef788e14e8e766976decd1d7"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449512 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b65b9966-sh4pn" event={"ID":"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed","Type":"ContainerDied","Data":"118e9705aaa8a134f7b51ee635781e210861eb775bae53298f2aeeb57f80314c"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449538 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-75b65b9966-sh4pn" event={"ID":"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed","Type":"ContainerDied","Data":"71b1be04cf2cb2930232b5e2dab15265d7f68dc9d126f6320eb3853e9ba1b642"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449549 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71b1be04cf2cb2930232b5e2dab15265d7f68dc9d126f6320eb3853e9ba1b642"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449566 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" event={"ID":"ad456b82-4bac-43be-b409-afcea1169a3a","Type":"ContainerDied","Data":"93eac6cd2687b29e56cb8aa14dd3ff285c9adbd212834dd76ce8b20dddf1148d"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449577 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" event={"ID":"ad456b82-4bac-43be-b409-afcea1169a3a","Type":"ContainerStarted","Data":"06dad16cde2c1ba53f9d0719ac9d703ee39746461f444e2e04e6b1c2e710df8d"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449604 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-da07-account-create-update-9c6n2" event={"ID":"b8fd306c-b4b4-4f8a-812c-4434756c38dc","Type":"ContainerStarted","Data":"bc6a32493240a4896f243809ff6679a83d07f28daf93d3915e8e960e12772231"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449615 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-055e-account-create-update-qzqpx" event={"ID":"f75c75d4-5e6d-40b0-831a-4114a94c6c64","Type":"ContainerStarted","Data":"c2222385c4493ff53ee30966f99bc0f666d565d2e1611fb26667c31a6c12c064"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449625 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerStarted","Data":"8d58bb1d19a9f117ae850ec6940fe10eb19ae00a1ae5ce5452b91762104cfd9c"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449637 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x8dss" event={"ID":"e7fbac44-b743-4049-ae33-92d8fdb12dfd","Type":"ContainerDied","Data":"8c57888648f8a3e10ca03931c4b38c398f05972b9e25aa40b62bd813a26a6b03"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449655 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x8dss" event={"ID":"e7fbac44-b743-4049-ae33-92d8fdb12dfd","Type":"ContainerStarted","Data":"c84c16fa92c1434375df370bed1ba0e81b7c63cbd4c9e71c840167c0f33862b6"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449665 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f22bff9a-80b3-41a7-9f0d-337b7c61b46a","Type":"ContainerDied","Data":"93e99510fbb0104bf49c97406e8d9871b96dea2647d12a473e268f7f79abc7e3"}
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.449687 4803 scope.go:117] "RemoveContainer" containerID="c3e79caec9548dc0bbc047750096a62108fde9eb28fbc970e8a22062d4aad3b6"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.472005 4803 scope.go:117] "RemoveContainer" containerID="df685bb06f31c910c8919b80d9b49a695a69dca779ad626488ca56799fd989e1"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.482687 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.492408 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b65b9966-sh4pn"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.508089 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.518044 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.525165 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:35:57 crc kubenswrapper[4803]: E0320 17:35:57.525520 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.525544 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon"
Mar 20 17:35:57 crc kubenswrapper[4803]: E0320 17:35:57.525577 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon-log"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.525583 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon-log"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.525755 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon-log"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.525767 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" containerName="horizon"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.527939 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.538144 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.538571 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.548722 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.570236 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-secret-key\") pod \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") "
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.571656 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-tls-certs\") pod \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") "
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.572064 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-logs\") pod \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") "
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.572331 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-combined-ca-bundle\") pod \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") "
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.572675 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-config-data\") pod \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") "
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.572798 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-logs" (OuterVolumeSpecName: "logs") pod "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" (UID: "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.577328 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwckr\" (UniqueName: \"kubernetes.io/projected/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-kube-api-access-fwckr\") pod \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") "
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.577421 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-scripts\") pod \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\" (UID: \"5fa5f0f8-cc68-400d-8570-df60f8d8c7ed\") "
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.579707 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5598996667-w8j76"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.582476 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27ce9fd0-5867-4d7f-aec4-70784b898289-logs\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587594 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzmsr\" (UniqueName: \"kubernetes.io/projected/27ce9fd0-5867-4d7f-aec4-70784b898289-kube-api-access-nzmsr\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587671 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-scripts\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587698 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-config-data\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587757 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27ce9fd0-5867-4d7f-aec4-70784b898289-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587825 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587871 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587896 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.587959 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.584655 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5598996667-w8j76"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.584698 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" (UID: "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.584821 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-kube-api-access-fwckr" (OuterVolumeSpecName: "kube-api-access-fwckr") pod "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" (UID: "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed"). InnerVolumeSpecName "kube-api-access-fwckr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.619070 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-config-data" (OuterVolumeSpecName: "config-data") pod "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" (UID: "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.662649 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-scripts" (OuterVolumeSpecName: "scripts") pod "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" (UID: "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.690097 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27ce9fd0-5867-4d7f-aec4-70784b898289-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.690187 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.690246 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.690285 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.690342 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27ce9fd0-5867-4d7f-aec4-70784b898289-logs\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691485 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzmsr\" (UniqueName: \"kubernetes.io/projected/27ce9fd0-5867-4d7f-aec4-70784b898289-kube-api-access-nzmsr\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691576 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-scripts\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691602 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-config-data\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691664 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691678 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwckr\" (UniqueName: \"kubernetes.io/projected/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-kube-api-access-fwckr\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691690 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691702 4803 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.691985 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.693156 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27ce9fd0-5867-4d7f-aec4-70784b898289-logs\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.694138 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/27ce9fd0-5867-4d7f-aec4-70784b898289-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.694181 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.700821 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-scripts\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.702008 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" (UID: "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.702326 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.705022 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27ce9fd0-5867-4d7f-aec4-70784b898289-config-data\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.716900 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzmsr\" (UniqueName: \"kubernetes.io/projected/27ce9fd0-5867-4d7f-aec4-70784b898289-kube-api-access-nzmsr\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.721251 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" (UID: "5fa5f0f8-cc68-400d-8570-df60f8d8c7ed"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.754689 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"27ce9fd0-5867-4d7f-aec4-70784b898289\") " pod="openstack/glance-default-internal-api-0"
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.796882 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.796915 4803 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:35:57 crc kubenswrapper[4803]: I0320 17:35:57.880125 4803 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.160815 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.337591 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.430082 4803 generic.go:334] "Generic (PLEG): container finished" podID="f75c75d4-5e6d-40b0-831a-4114a94c6c64" containerID="34f20c260996c5a08bf4fdfa8d1eb8d690278115a62dfd517d0b01139f2496d6" exitCode=0 Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.430189 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-055e-account-create-update-qzqpx" event={"ID":"f75c75d4-5e6d-40b0-831a-4114a94c6c64","Type":"ContainerDied","Data":"34f20c260996c5a08bf4fdfa8d1eb8d690278115a62dfd517d0b01139f2496d6"} Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.433828 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerStarted","Data":"8804995444a45fc34f7151c5b2ea8602b283cddcc234f4bf6e854a9858395f76"} Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.433860 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerStarted","Data":"0c635990aa324eaad4e42fa0922005b5d2abbde11a3e013289a078c83f1c77cf"} Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.437478 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"27ce9fd0-5867-4d7f-aec4-70784b898289","Type":"ContainerStarted","Data":"a03e9c579d0aff29bc6270992cff9a40f74b230a434460caebaf2dbe5515beeb"} Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.441003 4803 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"de8af7e5-2a44-4caa-883c-7ef2027e69c4","Type":"ContainerStarted","Data":"a61565c9bab38855bdbfddd591dfaa14890aab10cab65a0875e92c46940e4b03"} Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.443131 4803 generic.go:334] "Generic (PLEG): container finished" podID="b8fd306c-b4b4-4f8a-812c-4434756c38dc" containerID="4d7d09ed102210b78de59f469432ae289aeec5293a29bb3cc365a921ee93281f" exitCode=0 Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.444507 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-da07-account-create-update-9c6n2" event={"ID":"b8fd306c-b4b4-4f8a-812c-4434756c38dc","Type":"ContainerDied","Data":"4d7d09ed102210b78de59f469432ae289aeec5293a29bb3cc365a921ee93281f"} Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.444591 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-75b65b9966-sh4pn" Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.491797 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-75b65b9966-sh4pn"] Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.519235 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-75b65b9966-sh4pn"] Mar 20 17:35:58 crc kubenswrapper[4803]: E0320 17:35:58.738048 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e954db_f75f_4ab2_93af_c3f0738748ac.slice/crio-6ab2cf5f45350281da36b5c6137444d215cbfd248825c7960db825bb0ae4aaa9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-6604352edbf5672477a53f226bfa7a69c5155e198737ed69726600ee44b8d7fe\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod824aff30_6d5e_4489_bf27_79910aafe31e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod824aff30_6d5e_4489_bf27_79910aafe31e.slice/crio-cb5be6fb0d30073e29afd2569eac9ec86bda75041effcb358329f79dbd107482\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa5f0f8_cc68_400d_8570_df60f8d8c7ed.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e954db_f75f_4ab2_93af_c3f0738748ac.slice/crio-af5b4baeba5181ab821cd4086bfa7f3a4696767e199b4932917faa1367122571.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22bff9a_80b3_41a7_9f0d_337b7c61b46a.slice/crio-df685bb06f31c910c8919b80d9b49a695a69dca779ad626488ca56799fd989e1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa5f0f8_cc68_400d_8570_df60f8d8c7ed.slice/crio-conmon-118e9705aaa8a134f7b51ee635781e210861eb775bae53298f2aeeb57f80314c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22bff9a_80b3_41a7_9f0d_337b7c61b46a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-conmon-bff37b47bc7555f549b0c6203954ec0d02e36d7a6dba536236b062330184be15.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod824aff30_6d5e_4489_bf27_79910aafe31e.slice/crio-f0240cdb4bc1cd10ecc35ba3f331154b3247f37c1243547d2fa84fce4534a75a.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce4ab454_b5e3_435e_b406_ee5891a82b69.slice/crio-conmon-3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e954db_f75f_4ab2_93af_c3f0738748ac.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-conmon-4b7680dd7ce3f95d2239bf669dacd6a291aef4b734cafb33bbfd0cb3b924ad73.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-af3018a478f58bc49449e6e7cba6f378481703f4bc819c22afd93d43e3228f30.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa5f0f8_cc68_400d_8570_df60f8d8c7ed.slice/crio-71b1be04cf2cb2930232b5e2dab15265d7f68dc9d126f6320eb3853e9ba1b642\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce4ab454_b5e3_435e_b406_ee5891a82b69.slice/crio-conmon-742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-bff37b47bc7555f549b0c6203954ec0d02e36d7a6dba536236b062330184be15.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22bff9a_80b3_41a7_9f0d_337b7c61b46a.slice/crio-c3e79caec9548dc0bbc047750096a62108fde9eb28fbc970e8a22062d4aad3b6.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e954db_f75f_4ab2_93af_c3f0738748ac.slice/crio-fb545e96b631a2d935922a41a38694bf2f88a00a41527992a85d2e3be6b73b3b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e954db_f75f_4ab2_93af_c3f0738748ac.slice/crio-conmon-6ab2cf5f45350281da36b5c6137444d215cbfd248825c7960db825bb0ae4aaa9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-4b7680dd7ce3f95d2239bf669dacd6a291aef4b734cafb33bbfd0cb3b924ad73.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7e954db_f75f_4ab2_93af_c3f0738748ac.slice/crio-conmon-af5b4baeba5181ab821cd4086bfa7f3a4696767e199b4932917faa1367122571.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fa5f0f8_cc68_400d_8570_df60f8d8c7ed.slice/crio-118e9705aaa8a134f7b51ee635781e210861eb775bae53298f2aeeb57f80314c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-936bb04a57759b534c0f972fdb52034feee0bcf9146e96363e79ff604490434a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d5fb2c1_3925_4353_a0f9_c3e00ac43b91.slice/crio-conmon-af3018a478f58bc49449e6e7cba6f378481703f4bc819c22afd93d43e3228f30.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf22bff9a_80b3_41a7_9f0d_337b7c61b46a.slice/crio-conmon-df685bb06f31c910c8919b80d9b49a695a69dca779ad626488ca56799fd989e1.scope\": RecentStats: unable to find data in memory cache]" Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.864708 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fa5f0f8-cc68-400d-8570-df60f8d8c7ed" path="/var/lib/kubelet/pods/5fa5f0f8-cc68-400d-8570-df60f8d8c7ed/volumes" Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.865775 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f22bff9a-80b3-41a7-9f0d-337b7c61b46a" path="/var/lib/kubelet/pods/f22bff9a-80b3-41a7-9f0d-337b7c61b46a/volumes" Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.876967 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.931681 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkrs2\" (UniqueName: \"kubernetes.io/projected/ae48d528-5867-44e8-a58b-7a035fd1c0a7-kube-api-access-rkrs2\") pod \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.931755 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae48d528-5867-44e8-a58b-7a035fd1c0a7-operator-scripts\") pod \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\" (UID: \"ae48d528-5867-44e8-a58b-7a035fd1c0a7\") " Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.933604 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae48d528-5867-44e8-a58b-7a035fd1c0a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae48d528-5867-44e8-a58b-7a035fd1c0a7" (UID: 
"ae48d528-5867-44e8-a58b-7a035fd1c0a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:58 crc kubenswrapper[4803]: I0320 17:35:58.990913 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae48d528-5867-44e8-a58b-7a035fd1c0a7-kube-api-access-rkrs2" (OuterVolumeSpecName: "kube-api-access-rkrs2") pod "ae48d528-5867-44e8-a58b-7a035fd1c0a7" (UID: "ae48d528-5867-44e8-a58b-7a035fd1c0a7"). InnerVolumeSpecName "kube-api-access-rkrs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.033443 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkrs2\" (UniqueName: \"kubernetes.io/projected/ae48d528-5867-44e8-a58b-7a035fd1c0a7-kube-api-access-rkrs2\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.033471 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae48d528-5867-44e8-a58b-7a035fd1c0a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.122132 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.129349 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.134261 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdmsw\" (UniqueName: \"kubernetes.io/projected/a45b6fb6-a659-4a13-96d8-2f41634e3423-kube-api-access-tdmsw\") pod \"a45b6fb6-a659-4a13-96d8-2f41634e3423\" (UID: \"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.134302 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45b6fb6-a659-4a13-96d8-2f41634e3423-operator-scripts\") pod \"a45b6fb6-a659-4a13-96d8-2f41634e3423\" (UID: \"a45b6fb6-a659-4a13-96d8-2f41634e3423\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.134402 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpb6c\" (UniqueName: \"kubernetes.io/projected/e7fbac44-b743-4049-ae33-92d8fdb12dfd-kube-api-access-jpb6c\") pod \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.134458 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbac44-b743-4049-ae33-92d8fdb12dfd-operator-scripts\") pod \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\" (UID: \"e7fbac44-b743-4049-ae33-92d8fdb12dfd\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.135386 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7fbac44-b743-4049-ae33-92d8fdb12dfd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7fbac44-b743-4049-ae33-92d8fdb12dfd" (UID: "e7fbac44-b743-4049-ae33-92d8fdb12dfd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.136339 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a45b6fb6-a659-4a13-96d8-2f41634e3423-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a45b6fb6-a659-4a13-96d8-2f41634e3423" (UID: "a45b6fb6-a659-4a13-96d8-2f41634e3423"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.139618 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fbac44-b743-4049-ae33-92d8fdb12dfd-kube-api-access-jpb6c" (OuterVolumeSpecName: "kube-api-access-jpb6c") pod "e7fbac44-b743-4049-ae33-92d8fdb12dfd" (UID: "e7fbac44-b743-4049-ae33-92d8fdb12dfd"). InnerVolumeSpecName "kube-api-access-jpb6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.140888 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a45b6fb6-a659-4a13-96d8-2f41634e3423-kube-api-access-tdmsw" (OuterVolumeSpecName: "kube-api-access-tdmsw") pod "a45b6fb6-a659-4a13-96d8-2f41634e3423" (UID: "a45b6fb6-a659-4a13-96d8-2f41634e3423"). InnerVolumeSpecName "kube-api-access-tdmsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.171487 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.186865 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.236119 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a45b6fb6-a659-4a13-96d8-2f41634e3423-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.236149 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpb6c\" (UniqueName: \"kubernetes.io/projected/e7fbac44-b743-4049-ae33-92d8fdb12dfd-kube-api-access-jpb6c\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.236161 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7fbac44-b743-4049-ae33-92d8fdb12dfd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.236171 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdmsw\" (UniqueName: \"kubernetes.io/projected/a45b6fb6-a659-4a13-96d8-2f41634e3423-kube-api-access-tdmsw\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.338237 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx7f7\" (UniqueName: \"kubernetes.io/projected/ce4ab454-b5e3-435e-b406-ee5891a82b69-kube-api-access-jx7f7\") pod \"ce4ab454-b5e3-435e-b406-ee5891a82b69\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.338392 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-httpd-config\") pod \"ce4ab454-b5e3-435e-b406-ee5891a82b69\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.338474 4803 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sh7km\" (UniqueName: \"kubernetes.io/projected/ad456b82-4bac-43be-b409-afcea1169a3a-kube-api-access-sh7km\") pod \"ad456b82-4bac-43be-b409-afcea1169a3a\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.338601 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-ovndb-tls-certs\") pod \"ce4ab454-b5e3-435e-b406-ee5891a82b69\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.338652 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-config\") pod \"ce4ab454-b5e3-435e-b406-ee5891a82b69\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.338834 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad456b82-4bac-43be-b409-afcea1169a3a-operator-scripts\") pod \"ad456b82-4bac-43be-b409-afcea1169a3a\" (UID: \"ad456b82-4bac-43be-b409-afcea1169a3a\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.338933 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-combined-ca-bundle\") pod \"ce4ab454-b5e3-435e-b406-ee5891a82b69\" (UID: \"ce4ab454-b5e3-435e-b406-ee5891a82b69\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.343104 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad456b82-4bac-43be-b409-afcea1169a3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad456b82-4bac-43be-b409-afcea1169a3a" (UID: 
"ad456b82-4bac-43be-b409-afcea1169a3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.347130 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4ab454-b5e3-435e-b406-ee5891a82b69-kube-api-access-jx7f7" (OuterVolumeSpecName: "kube-api-access-jx7f7") pod "ce4ab454-b5e3-435e-b406-ee5891a82b69" (UID: "ce4ab454-b5e3-435e-b406-ee5891a82b69"). InnerVolumeSpecName "kube-api-access-jx7f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.347238 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ce4ab454-b5e3-435e-b406-ee5891a82b69" (UID: "ce4ab454-b5e3-435e-b406-ee5891a82b69"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.361669 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad456b82-4bac-43be-b409-afcea1169a3a-kube-api-access-sh7km" (OuterVolumeSpecName: "kube-api-access-sh7km") pod "ad456b82-4bac-43be-b409-afcea1169a3a" (UID: "ad456b82-4bac-43be-b409-afcea1169a3a"). InnerVolumeSpecName "kube-api-access-sh7km". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.431676 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce4ab454-b5e3-435e-b406-ee5891a82b69" (UID: "ce4ab454-b5e3-435e-b406-ee5891a82b69"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.440426 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad456b82-4bac-43be-b409-afcea1169a3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.440453 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.440463 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx7f7\" (UniqueName: \"kubernetes.io/projected/ce4ab454-b5e3-435e-b406-ee5891a82b69-kube-api-access-jx7f7\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.440472 4803 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.440481 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh7km\" (UniqueName: \"kubernetes.io/projected/ad456b82-4bac-43be-b409-afcea1169a3a-kube-api-access-sh7km\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.454431 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-config" (OuterVolumeSpecName: "config") pod "ce4ab454-b5e3-435e-b406-ee5891a82b69" (UID: "ce4ab454-b5e3-435e-b406-ee5891a82b69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.467437 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-x8dss" event={"ID":"e7fbac44-b743-4049-ae33-92d8fdb12dfd","Type":"ContainerDied","Data":"c84c16fa92c1434375df370bed1ba0e81b7c63cbd4c9e71c840167c0f33862b6"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.467875 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c84c16fa92c1434375df370bed1ba0e81b7c63cbd4c9e71c840167c0f33862b6" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.467472 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-x8dss" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.471885 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" event={"ID":"ad456b82-4bac-43be-b409-afcea1169a3a","Type":"ContainerDied","Data":"06dad16cde2c1ba53f9d0719ac9d703ee39746461f444e2e04e6b1c2e710df8d"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.471920 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06dad16cde2c1ba53f9d0719ac9d703ee39746461f444e2e04e6b1c2e710df8d" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.471962 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ded6-account-create-update-b2sc5" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.476509 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ce4ab454-b5e3-435e-b406-ee5891a82b69" (UID: "ce4ab454-b5e3-435e-b406-ee5891a82b69"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.483822 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"27ce9fd0-5867-4d7f-aec4-70784b898289","Type":"ContainerStarted","Data":"921a3c4b5352f259f264c0c8f9d1e121db18dfce1ca459872f68a4a149378f48"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.498101 4803 generic.go:334] "Generic (PLEG): container finished" podID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerID="3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e" exitCode=0 Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.498136 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-799cb79944-p24nt" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.498142 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-799cb79944-p24nt" event={"ID":"ce4ab454-b5e3-435e-b406-ee5891a82b69","Type":"ContainerDied","Data":"3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.498204 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-799cb79944-p24nt" event={"ID":"ce4ab454-b5e3-435e-b406-ee5891a82b69","Type":"ContainerDied","Data":"ed79709addf6e0fc81955e356bba5d9b39283203eec242bee81945b7efc0f246"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.498236 4803 scope.go:117] "RemoveContainer" containerID="742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.501712 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"de8af7e5-2a44-4caa-883c-7ef2027e69c4","Type":"ContainerStarted","Data":"c704e1ef9e2a868829493cf14ca6248aa900d0bafec126b70968da449acb0432"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.504881 4803 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-t2hrj" event={"ID":"a45b6fb6-a659-4a13-96d8-2f41634e3423","Type":"ContainerDied","Data":"0de152469403789f65c39bd255efc97fb2c63eb974c6eca01725ea343ca183a2"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.504919 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-t2hrj" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.504932 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de152469403789f65c39bd255efc97fb2c63eb974c6eca01725ea343ca183a2" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.510894 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerStarted","Data":"82f71af49f3e0a19db61e64ccc7aefaaaf34f629e694565ae9c235b87f14eeb4"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.515370 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9zd4x" event={"ID":"ae48d528-5867-44e8-a58b-7a035fd1c0a7","Type":"ContainerDied","Data":"0f033bdad8bd94e09f76c2c3c786e2d1359a9337ef788e14e8e766976decd1d7"} Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.515408 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f033bdad8bd94e09f76c2c3c786e2d1359a9337ef788e14e8e766976decd1d7" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.515406 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-9zd4x" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.542314 4803 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.542338 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ce4ab454-b5e3-435e-b406-ee5891a82b69-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.544861 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-799cb79944-p24nt"] Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.552388 4803 scope.go:117] "RemoveContainer" containerID="3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.554713 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-799cb79944-p24nt"] Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.576035 4803 scope.go:117] "RemoveContainer" containerID="742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d" Mar 20 17:35:59 crc kubenswrapper[4803]: E0320 17:35:59.576400 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d\": container with ID starting with 742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d not found: ID does not exist" containerID="742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.576432 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d"} err="failed to get container status 
\"742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d\": rpc error: code = NotFound desc = could not find container \"742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d\": container with ID starting with 742f33ac76319a963b59f7cefd35e02d2f9446ca719e3f243f3fa511802bc55d not found: ID does not exist" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.576453 4803 scope.go:117] "RemoveContainer" containerID="3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e" Mar 20 17:35:59 crc kubenswrapper[4803]: E0320 17:35:59.576753 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e\": container with ID starting with 3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e not found: ID does not exist" containerID="3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.576776 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e"} err="failed to get container status \"3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e\": rpc error: code = NotFound desc = could not find container \"3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e\": container with ID starting with 3199275d1bf1ff2649f4d6dcf943570dda1cf6b8672b2021dca28dc19070b62e not found: ID does not exist" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.794963 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.950177 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8fd306c-b4b4-4f8a-812c-4434756c38dc-operator-scripts\") pod \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.950336 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njn7s\" (UniqueName: \"kubernetes.io/projected/b8fd306c-b4b4-4f8a-812c-4434756c38dc-kube-api-access-njn7s\") pod \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\" (UID: \"b8fd306c-b4b4-4f8a-812c-4434756c38dc\") " Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.951702 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8fd306c-b4b4-4f8a-812c-4434756c38dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b8fd306c-b4b4-4f8a-812c-4434756c38dc" (UID: "b8fd306c-b4b4-4f8a-812c-4434756c38dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:35:59 crc kubenswrapper[4803]: I0320 17:35:59.964311 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8fd306c-b4b4-4f8a-812c-4434756c38dc-kube-api-access-njn7s" (OuterVolumeSpecName: "kube-api-access-njn7s") pod "b8fd306c-b4b4-4f8a-812c-4434756c38dc" (UID: "b8fd306c-b4b4-4f8a-812c-4434756c38dc"). InnerVolumeSpecName "kube-api-access-njn7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.036246 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.052329 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8fd306c-b4b4-4f8a-812c-4434756c38dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.053202 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njn7s\" (UniqueName: \"kubernetes.io/projected/b8fd306c-b4b4-4f8a-812c-4434756c38dc-kube-api-access-njn7s\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132138 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567136-nkrhf"] Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132573 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8fd306c-b4b4-4f8a-812c-4434756c38dc" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132597 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8fd306c-b4b4-4f8a-812c-4434756c38dc" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132624 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad456b82-4bac-43be-b409-afcea1169a3a" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132635 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad456b82-4bac-43be-b409-afcea1169a3a" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132650 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae48d528-5867-44e8-a58b-7a035fd1c0a7" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132657 4803 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ae48d528-5867-44e8-a58b-7a035fd1c0a7" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132676 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-api" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132682 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-api" Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132694 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fbac44-b743-4049-ae33-92d8fdb12dfd" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132702 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fbac44-b743-4049-ae33-92d8fdb12dfd" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132710 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75c75d4-5e6d-40b0-831a-4114a94c6c64" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132719 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75c75d4-5e6d-40b0-831a-4114a94c6c64" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132731 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-httpd" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132738 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-httpd" Mar 20 17:36:00 crc kubenswrapper[4803]: E0320 17:36:00.132750 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a45b6fb6-a659-4a13-96d8-2f41634e3423" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132757 4803 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a45b6fb6-a659-4a13-96d8-2f41634e3423" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.132989 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fbac44-b743-4049-ae33-92d8fdb12dfd" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133005 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75c75d4-5e6d-40b0-831a-4114a94c6c64" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133015 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-httpd" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133026 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae48d528-5867-44e8-a58b-7a035fd1c0a7" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133038 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" containerName="neutron-api" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133052 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="a45b6fb6-a659-4a13-96d8-2f41634e3423" containerName="mariadb-database-create" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133060 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad456b82-4bac-43be-b409-afcea1169a3a" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133073 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8fd306c-b4b4-4f8a-812c-4434756c38dc" containerName="mariadb-account-create-update" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.133822 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-nkrhf" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.136112 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.136470 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.136832 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.149080 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-nkrhf"] Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.159142 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75c75d4-5e6d-40b0-831a-4114a94c6c64-operator-scripts\") pod \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.159193 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcdcc\" (UniqueName: \"kubernetes.io/projected/f75c75d4-5e6d-40b0-831a-4114a94c6c64-kube-api-access-dcdcc\") pod \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\" (UID: \"f75c75d4-5e6d-40b0-831a-4114a94c6c64\") " Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.163039 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f75c75d4-5e6d-40b0-831a-4114a94c6c64-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f75c75d4-5e6d-40b0-831a-4114a94c6c64" (UID: "f75c75d4-5e6d-40b0-831a-4114a94c6c64"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.166563 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f75c75d4-5e6d-40b0-831a-4114a94c6c64-kube-api-access-dcdcc" (OuterVolumeSpecName: "kube-api-access-dcdcc") pod "f75c75d4-5e6d-40b0-831a-4114a94c6c64" (UID: "f75c75d4-5e6d-40b0-831a-4114a94c6c64"). InnerVolumeSpecName "kube-api-access-dcdcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.261424 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph8s5\" (UniqueName: \"kubernetes.io/projected/af2e4734-6885-46ad-a064-324222e418cc-kube-api-access-ph8s5\") pod \"auto-csr-approver-29567136-nkrhf\" (UID: \"af2e4734-6885-46ad-a064-324222e418cc\") " pod="openshift-infra/auto-csr-approver-29567136-nkrhf" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.261604 4803 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f75c75d4-5e6d-40b0-831a-4114a94c6c64-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.261618 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcdcc\" (UniqueName: \"kubernetes.io/projected/f75c75d4-5e6d-40b0-831a-4114a94c6c64-kube-api-access-dcdcc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.363445 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph8s5\" (UniqueName: \"kubernetes.io/projected/af2e4734-6885-46ad-a064-324222e418cc-kube-api-access-ph8s5\") pod \"auto-csr-approver-29567136-nkrhf\" (UID: \"af2e4734-6885-46ad-a064-324222e418cc\") " pod="openshift-infra/auto-csr-approver-29567136-nkrhf" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.395272 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph8s5\" (UniqueName: \"kubernetes.io/projected/af2e4734-6885-46ad-a064-324222e418cc-kube-api-access-ph8s5\") pod \"auto-csr-approver-29567136-nkrhf\" (UID: \"af2e4734-6885-46ad-a064-324222e418cc\") " pod="openshift-infra/auto-csr-approver-29567136-nkrhf" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.448998 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-nkrhf" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.541692 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"de8af7e5-2a44-4caa-883c-7ef2027e69c4","Type":"ContainerStarted","Data":"8f93e90d5cedb607d798031f376336e377d35c5b552f621a103b3b0d8d2e93e9"} Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.569799 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.569781812 podStartE2EDuration="4.569781812s" podCreationTimestamp="2026-03-20 17:35:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:00.568426075 +0000 UTC m=+1170.480018155" watchObservedRunningTime="2026-03-20 17:36:00.569781812 +0000 UTC m=+1170.481373882" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.576307 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-da07-account-create-update-9c6n2" event={"ID":"b8fd306c-b4b4-4f8a-812c-4434756c38dc","Type":"ContainerDied","Data":"bc6a32493240a4896f243809ff6679a83d07f28daf93d3915e8e960e12772231"} Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.576347 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc6a32493240a4896f243809ff6679a83d07f28daf93d3915e8e960e12772231" Mar 20 17:36:00 crc 
kubenswrapper[4803]: I0320 17:36:00.576411 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-da07-account-create-update-9c6n2" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.578083 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-055e-account-create-update-qzqpx" event={"ID":"f75c75d4-5e6d-40b0-831a-4114a94c6c64","Type":"ContainerDied","Data":"c2222385c4493ff53ee30966f99bc0f666d565d2e1611fb26667c31a6c12c064"} Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.578177 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2222385c4493ff53ee30966f99bc0f666d565d2e1611fb26667c31a6c12c064" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.578272 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-055e-account-create-update-qzqpx" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.582824 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"27ce9fd0-5867-4d7f-aec4-70784b898289","Type":"ContainerStarted","Data":"24155974e3672c15a74c7f3624385c16d7c057270b7cab6b9ad4726a37dece2a"} Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.620249 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.620228171 podStartE2EDuration="3.620228171s" podCreationTimestamp="2026-03-20 17:35:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:00.613699355 +0000 UTC m=+1170.525291445" watchObservedRunningTime="2026-03-20 17:36:00.620228171 +0000 UTC m=+1170.531820241" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.882505 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4ab454-b5e3-435e-b406-ee5891a82b69" 
path="/var/lib/kubelet/pods/ce4ab454-b5e3-435e-b406-ee5891a82b69/volumes" Mar 20 17:36:00 crc kubenswrapper[4803]: I0320 17:36:00.917583 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-nkrhf"] Mar 20 17:36:01 crc kubenswrapper[4803]: I0320 17:36:01.603606 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-nkrhf" event={"ID":"af2e4734-6885-46ad-a064-324222e418cc","Type":"ContainerStarted","Data":"23290eb75c7a9507c193bdb2d914769c88c966e2eb3be1966dbe5f530b118543"} Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.623969 4803 generic.go:334] "Generic (PLEG): container finished" podID="af2e4734-6885-46ad-a064-324222e418cc" containerID="93fad68ac303ce1c7c559099d76de9a994a32245356543e5450b75e8788e0c50" exitCode=0 Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.624062 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-nkrhf" event={"ID":"af2e4734-6885-46ad-a064-324222e418cc","Type":"ContainerDied","Data":"93fad68ac303ce1c7c559099d76de9a994a32245356543e5450b75e8788e0c50"} Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.627264 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerStarted","Data":"3c9908e3082022fc2c87e5069b769e66b7990ea8af77a35783c94a9e07c49599"} Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.627438 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-central-agent" containerID="cri-o://0c635990aa324eaad4e42fa0922005b5d2abbde11a3e013289a078c83f1c77cf" gracePeriod=30 Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.627507 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:36:03 crc 
kubenswrapper[4803]: I0320 17:36:03.627561 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-notification-agent" containerID="cri-o://8804995444a45fc34f7151c5b2ea8602b283cddcc234f4bf6e854a9858395f76" gracePeriod=30 Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.627513 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="sg-core" containerID="cri-o://82f71af49f3e0a19db61e64ccc7aefaaaf34f629e694565ae9c235b87f14eeb4" gracePeriod=30 Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.627630 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="proxy-httpd" containerID="cri-o://3c9908e3082022fc2c87e5069b769e66b7990ea8af77a35783c94a9e07c49599" gracePeriod=30 Mar 20 17:36:03 crc kubenswrapper[4803]: I0320 17:36:03.674751 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.749989481 podStartE2EDuration="13.674729478s" podCreationTimestamp="2026-03-20 17:35:50 +0000 UTC" firstStartedPulling="2026-03-20 17:35:56.493603845 +0000 UTC m=+1166.405195915" lastFinishedPulling="2026-03-20 17:36:02.418343842 +0000 UTC m=+1172.329935912" observedRunningTime="2026-03-20 17:36:03.660380431 +0000 UTC m=+1173.571972511" watchObservedRunningTime="2026-03-20 17:36:03.674729478 +0000 UTC m=+1173.586321558" Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642452 4803 generic.go:334] "Generic (PLEG): container finished" podID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerID="3c9908e3082022fc2c87e5069b769e66b7990ea8af77a35783c94a9e07c49599" exitCode=0 Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642804 4803 generic.go:334] "Generic (PLEG): 
container finished" podID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerID="82f71af49f3e0a19db61e64ccc7aefaaaf34f629e694565ae9c235b87f14eeb4" exitCode=2 Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642816 4803 generic.go:334] "Generic (PLEG): container finished" podID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerID="8804995444a45fc34f7151c5b2ea8602b283cddcc234f4bf6e854a9858395f76" exitCode=0 Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642825 4803 generic.go:334] "Generic (PLEG): container finished" podID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerID="0c635990aa324eaad4e42fa0922005b5d2abbde11a3e013289a078c83f1c77cf" exitCode=0 Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642543 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerDied","Data":"3c9908e3082022fc2c87e5069b769e66b7990ea8af77a35783c94a9e07c49599"} Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642939 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerDied","Data":"82f71af49f3e0a19db61e64ccc7aefaaaf34f629e694565ae9c235b87f14eeb4"} Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642970 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerDied","Data":"8804995444a45fc34f7151c5b2ea8602b283cddcc234f4bf6e854a9858395f76"} Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.642984 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerDied","Data":"0c635990aa324eaad4e42fa0922005b5d2abbde11a3e013289a078c83f1c77cf"} Mar 20 17:36:04 crc kubenswrapper[4803]: I0320 17:36:04.964282 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.049559 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-sg-core-conf-yaml\") pod \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.049689 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-combined-ca-bundle\") pod \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.049726 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-scripts\") pod \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.050901 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-config-data\") pod \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.050991 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7csq8\" (UniqueName: \"kubernetes.io/projected/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-kube-api-access-7csq8\") pod \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.051095 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-log-httpd\") pod \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.051185 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-run-httpd\") pod \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\" (UID: \"d6e3f243-5e81-4c05-8563-8a7b9494cc6d\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.052060 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d6e3f243-5e81-4c05-8563-8a7b9494cc6d" (UID: "d6e3f243-5e81-4c05-8563-8a7b9494cc6d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.052404 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d6e3f243-5e81-4c05-8563-8a7b9494cc6d" (UID: "d6e3f243-5e81-4c05-8563-8a7b9494cc6d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.057972 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-scripts" (OuterVolumeSpecName: "scripts") pod "d6e3f243-5e81-4c05-8563-8a7b9494cc6d" (UID: "d6e3f243-5e81-4c05-8563-8a7b9494cc6d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.062192 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-kube-api-access-7csq8" (OuterVolumeSpecName: "kube-api-access-7csq8") pod "d6e3f243-5e81-4c05-8563-8a7b9494cc6d" (UID: "d6e3f243-5e81-4c05-8563-8a7b9494cc6d"). InnerVolumeSpecName "kube-api-access-7csq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.085236 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d6e3f243-5e81-4c05-8563-8a7b9494cc6d" (UID: "d6e3f243-5e81-4c05-8563-8a7b9494cc6d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.151787 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6e3f243-5e81-4c05-8563-8a7b9494cc6d" (UID: "d6e3f243-5e81-4c05-8563-8a7b9494cc6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.153211 4803 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.153233 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.153247 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.153257 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7csq8\" (UniqueName: \"kubernetes.io/projected/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-kube-api-access-7csq8\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.153268 4803 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.153277 4803 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.174534 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-config-data" (OuterVolumeSpecName: "config-data") pod "d6e3f243-5e81-4c05-8563-8a7b9494cc6d" (UID: "d6e3f243-5e81-4c05-8563-8a7b9494cc6d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.176572 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-nkrhf" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.247888 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jd6h"] Mar 20 17:36:05 crc kubenswrapper[4803]: E0320 17:36:05.248406 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-notification-agent" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.248470 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-notification-agent" Mar 20 17:36:05 crc kubenswrapper[4803]: E0320 17:36:05.248544 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-central-agent" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.248596 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-central-agent" Mar 20 17:36:05 crc kubenswrapper[4803]: E0320 17:36:05.248649 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af2e4734-6885-46ad-a064-324222e418cc" containerName="oc" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.248723 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="af2e4734-6885-46ad-a064-324222e418cc" containerName="oc" Mar 20 17:36:05 crc kubenswrapper[4803]: E0320 17:36:05.248779 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="sg-core" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.248827 4803 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="sg-core" Mar 20 17:36:05 crc kubenswrapper[4803]: E0320 17:36:05.248887 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="proxy-httpd" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.248936 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="proxy-httpd" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.249132 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-notification-agent" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.249201 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="ceilometer-central-agent" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.249259 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="af2e4734-6885-46ad-a064-324222e418cc" containerName="oc" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.249318 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="sg-core" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.249374 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" containerName="proxy-httpd" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.254847 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph8s5\" (UniqueName: \"kubernetes.io/projected/af2e4734-6885-46ad-a064-324222e418cc-kube-api-access-ph8s5\") pod \"af2e4734-6885-46ad-a064-324222e418cc\" (UID: \"af2e4734-6885-46ad-a064-324222e418cc\") " Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.255242 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d6e3f243-5e81-4c05-8563-8a7b9494cc6d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.257439 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.259357 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-246vn" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.259977 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.260829 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.267185 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af2e4734-6885-46ad-a064-324222e418cc-kube-api-access-ph8s5" (OuterVolumeSpecName: "kube-api-access-ph8s5") pod "af2e4734-6885-46ad-a064-324222e418cc" (UID: "af2e4734-6885-46ad-a064-324222e418cc"). InnerVolumeSpecName "kube-api-access-ph8s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.286758 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jd6h"] Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.361546 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.361609 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-scripts\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.361649 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-config-data\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.361710 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk895\" (UniqueName: \"kubernetes.io/projected/c9eefa09-fd26-4111-a6e7-605629e9192c-kube-api-access-pk895\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.361811 4803 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-ph8s5\" (UniqueName: \"kubernetes.io/projected/af2e4734-6885-46ad-a064-324222e418cc-kube-api-access-ph8s5\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.463105 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-scripts\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.463151 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-config-data\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.463216 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk895\" (UniqueName: \"kubernetes.io/projected/c9eefa09-fd26-4111-a6e7-605629e9192c-kube-api-access-pk895\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.463292 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.466613 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-scripts\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.466672 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-config-data\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.467551 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.478384 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk895\" (UniqueName: \"kubernetes.io/projected/c9eefa09-fd26-4111-a6e7-605629e9192c-kube-api-access-pk895\") pod \"nova-cell0-conductor-db-sync-4jd6h\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.587489 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.666133 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567136-nkrhf" event={"ID":"af2e4734-6885-46ad-a064-324222e418cc","Type":"ContainerDied","Data":"23290eb75c7a9507c193bdb2d914769c88c966e2eb3be1966dbe5f530b118543"} Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.666160 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567136-nkrhf" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.666179 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23290eb75c7a9507c193bdb2d914769c88c966e2eb3be1966dbe5f530b118543" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.670364 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d6e3f243-5e81-4c05-8563-8a7b9494cc6d","Type":"ContainerDied","Data":"8d58bb1d19a9f117ae850ec6940fe10eb19ae00a1ae5ce5452b91762104cfd9c"} Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.670401 4803 scope.go:117] "RemoveContainer" containerID="3c9908e3082022fc2c87e5069b769e66b7990ea8af77a35783c94a9e07c49599" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.670554 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.713709 4803 scope.go:117] "RemoveContainer" containerID="82f71af49f3e0a19db61e64ccc7aefaaaf34f629e694565ae9c235b87f14eeb4" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.746687 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.754292 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.777369 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.779322 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.783133 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.783287 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.788066 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.790499 4803 scope.go:117] "RemoveContainer" containerID="8804995444a45fc34f7151c5b2ea8602b283cddcc234f4bf6e854a9858395f76" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.844190 4803 scope.go:117] "RemoveContainer" containerID="0c635990aa324eaad4e42fa0922005b5d2abbde11a3e013289a078c83f1c77cf" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.868266 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-config-data\") pod \"ceilometer-0\" (UID: 
\"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.868306 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.868330 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-log-httpd\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.868372 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-scripts\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.868412 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2htlq\" (UniqueName: \"kubernetes.io/projected/67003746-1204-4243-97c3-303e66e01d36-kube-api-access-2htlq\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.868448 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 
17:36:05.868469 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-run-httpd\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.969608 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2htlq\" (UniqueName: \"kubernetes.io/projected/67003746-1204-4243-97c3-303e66e01d36-kube-api-access-2htlq\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.969715 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.969759 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-run-httpd\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.969878 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-config-data\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.969929 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.969961 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-log-httpd\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.970047 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-scripts\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.970205 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-run-httpd\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.970790 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-log-httpd\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.976000 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.976149 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-scripts\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.976289 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-config-data\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.980738 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:05 crc kubenswrapper[4803]: I0320 17:36:05.995127 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2htlq\" (UniqueName: \"kubernetes.io/projected/67003746-1204-4243-97c3-303e66e01d36-kube-api-access-2htlq\") pod \"ceilometer-0\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " pod="openstack/ceilometer-0" Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.101154 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.110782 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jd6h"] Mar 20 17:36:06 crc kubenswrapper[4803]: W0320 17:36:06.126051 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9eefa09_fd26_4111_a6e7_605629e9192c.slice/crio-8cfebf94d5468da26653e5b0d94f51d1d62cfc26c33f68ab58b261b0c1a24388 WatchSource:0}: Error finding container 8cfebf94d5468da26653e5b0d94f51d1d62cfc26c33f68ab58b261b0c1a24388: Status 404 returned error can't find the container with id 8cfebf94d5468da26653e5b0d94f51d1d62cfc26c33f68ab58b261b0c1a24388 Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.249883 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-fhsx6"] Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.259829 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567130-fhsx6"] Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.529765 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:06 crc kubenswrapper[4803]: W0320 17:36:06.534096 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67003746_1204_4243_97c3_303e66e01d36.slice/crio-c5cf511c85a7bf9ac0a87dcfd08f5fa18ec93e44955d561b274c580c9cdf68e5 WatchSource:0}: Error finding container c5cf511c85a7bf9ac0a87dcfd08f5fa18ec93e44955d561b274c580c9cdf68e5: Status 404 returned error can't find the container with id c5cf511c85a7bf9ac0a87dcfd08f5fa18ec93e44955d561b274c580c9cdf68e5 Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.680897 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" 
event={"ID":"c9eefa09-fd26-4111-a6e7-605629e9192c","Type":"ContainerStarted","Data":"8cfebf94d5468da26653e5b0d94f51d1d62cfc26c33f68ab58b261b0c1a24388"} Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.684115 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerStarted","Data":"c5cf511c85a7bf9ac0a87dcfd08f5fa18ec93e44955d561b274c580c9cdf68e5"} Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.859822 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96522cae-d723-456d-be53-43bcdcda0146" path="/var/lib/kubelet/pods/96522cae-d723-456d-be53-43bcdcda0146/volumes" Mar 20 17:36:06 crc kubenswrapper[4803]: I0320 17:36:06.863824 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e3f243-5e81-4c05-8563-8a7b9494cc6d" path="/var/lib/kubelet/pods/d6e3f243-5e81-4c05-8563-8a7b9494cc6d/volumes" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.486265 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.486599 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.516989 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.533795 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.695616 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerStarted","Data":"edcc4d4b9e240dd16c1b628de76c9676190d27b6018ebdfadbbfd42517922167"} Mar 20 
17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.696178 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.696211 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.881751 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.881796 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.933804 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:07 crc kubenswrapper[4803]: I0320 17:36:07.948854 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:08 crc kubenswrapper[4803]: I0320 17:36:08.245620 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:36:08 crc kubenswrapper[4803]: I0320 17:36:08.245704 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:36:08 crc kubenswrapper[4803]: I0320 17:36:08.717938 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerStarted","Data":"228032a6f3fa52eee9c62a5bde539d4ff41641ab02bed721dd477fe67806451e"} Mar 20 17:36:08 crc kubenswrapper[4803]: I0320 17:36:08.718444 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:08 crc kubenswrapper[4803]: I0320 17:36:08.718482 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:09 crc kubenswrapper[4803]: I0320 17:36:09.574337 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:09 crc kubenswrapper[4803]: I0320 17:36:09.676324 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:36:09 crc kubenswrapper[4803]: I0320 17:36:09.685083 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 17:36:10 crc kubenswrapper[4803]: I0320 17:36:10.630581 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:10 crc kubenswrapper[4803]: I0320 17:36:10.670075 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 17:36:10 crc kubenswrapper[4803]: I0320 17:36:10.749149 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerStarted","Data":"4b3b6cc72ee6e4dac36de867cdbd28e6237c568dbfeda8457a36d3763f0d111d"} Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.807511 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" 
event={"ID":"c9eefa09-fd26-4111-a6e7-605629e9192c","Type":"ContainerStarted","Data":"0c2492519b69d5e624e7e1bfcdac8a3488889d3241da936111a3347141004dd1"} Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.810966 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerStarted","Data":"c30a8e3a3b072733209262bfd70afbfea55bc52ad7652c1d07f799ee6f6ed924"} Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.811083 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-central-agent" containerID="cri-o://edcc4d4b9e240dd16c1b628de76c9676190d27b6018ebdfadbbfd42517922167" gracePeriod=30 Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.811321 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.811362 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="proxy-httpd" containerID="cri-o://c30a8e3a3b072733209262bfd70afbfea55bc52ad7652c1d07f799ee6f6ed924" gracePeriod=30 Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.811406 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="sg-core" containerID="cri-o://4b3b6cc72ee6e4dac36de867cdbd28e6237c568dbfeda8457a36d3763f0d111d" gracePeriod=30 Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.811442 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-notification-agent" containerID="cri-o://228032a6f3fa52eee9c62a5bde539d4ff41641ab02bed721dd477fe67806451e" 
gracePeriod=30 Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.841881 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" podStartSLOduration=2.45840066 podStartE2EDuration="11.841859018s" podCreationTimestamp="2026-03-20 17:36:05 +0000 UTC" firstStartedPulling="2026-03-20 17:36:06.128480882 +0000 UTC m=+1176.040072952" lastFinishedPulling="2026-03-20 17:36:15.51193921 +0000 UTC m=+1185.423531310" observedRunningTime="2026-03-20 17:36:16.834335315 +0000 UTC m=+1186.745927395" watchObservedRunningTime="2026-03-20 17:36:16.841859018 +0000 UTC m=+1186.753451098" Mar 20 17:36:16 crc kubenswrapper[4803]: I0320 17:36:16.876373 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.902110933 podStartE2EDuration="11.876350147s" podCreationTimestamp="2026-03-20 17:36:05 +0000 UTC" firstStartedPulling="2026-03-20 17:36:06.536774761 +0000 UTC m=+1176.448366871" lastFinishedPulling="2026-03-20 17:36:15.511014015 +0000 UTC m=+1185.422606085" observedRunningTime="2026-03-20 17:36:16.870836878 +0000 UTC m=+1186.782428988" watchObservedRunningTime="2026-03-20 17:36:16.876350147 +0000 UTC m=+1186.787942237" Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.822510 4803 generic.go:334] "Generic (PLEG): container finished" podID="67003746-1204-4243-97c3-303e66e01d36" containerID="c30a8e3a3b072733209262bfd70afbfea55bc52ad7652c1d07f799ee6f6ed924" exitCode=0 Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.822879 4803 generic.go:334] "Generic (PLEG): container finished" podID="67003746-1204-4243-97c3-303e66e01d36" containerID="4b3b6cc72ee6e4dac36de867cdbd28e6237c568dbfeda8457a36d3763f0d111d" exitCode=2 Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.822891 4803 generic.go:334] "Generic (PLEG): container finished" podID="67003746-1204-4243-97c3-303e66e01d36" 
containerID="228032a6f3fa52eee9c62a5bde539d4ff41641ab02bed721dd477fe67806451e" exitCode=0 Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.822903 4803 generic.go:334] "Generic (PLEG): container finished" podID="67003746-1204-4243-97c3-303e66e01d36" containerID="edcc4d4b9e240dd16c1b628de76c9676190d27b6018ebdfadbbfd42517922167" exitCode=0 Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.822598 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerDied","Data":"c30a8e3a3b072733209262bfd70afbfea55bc52ad7652c1d07f799ee6f6ed924"} Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.823042 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerDied","Data":"4b3b6cc72ee6e4dac36de867cdbd28e6237c568dbfeda8457a36d3763f0d111d"} Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.823065 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerDied","Data":"228032a6f3fa52eee9c62a5bde539d4ff41641ab02bed721dd477fe67806451e"} Mar 20 17:36:17 crc kubenswrapper[4803]: I0320 17:36:17.823082 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerDied","Data":"edcc4d4b9e240dd16c1b628de76c9676190d27b6018ebdfadbbfd42517922167"} Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.346625 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.399205 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-log-httpd\") pod \"67003746-1204-4243-97c3-303e66e01d36\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.399252 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-run-httpd\") pod \"67003746-1204-4243-97c3-303e66e01d36\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.399358 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-scripts\") pod \"67003746-1204-4243-97c3-303e66e01d36\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.399376 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-combined-ca-bundle\") pod \"67003746-1204-4243-97c3-303e66e01d36\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.399432 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-config-data\") pod \"67003746-1204-4243-97c3-303e66e01d36\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.399448 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2htlq\" (UniqueName: 
\"kubernetes.io/projected/67003746-1204-4243-97c3-303e66e01d36-kube-api-access-2htlq\") pod \"67003746-1204-4243-97c3-303e66e01d36\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.399494 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-sg-core-conf-yaml\") pod \"67003746-1204-4243-97c3-303e66e01d36\" (UID: \"67003746-1204-4243-97c3-303e66e01d36\") " Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.400392 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "67003746-1204-4243-97c3-303e66e01d36" (UID: "67003746-1204-4243-97c3-303e66e01d36"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.400461 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "67003746-1204-4243-97c3-303e66e01d36" (UID: "67003746-1204-4243-97c3-303e66e01d36"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.409032 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-scripts" (OuterVolumeSpecName: "scripts") pod "67003746-1204-4243-97c3-303e66e01d36" (UID: "67003746-1204-4243-97c3-303e66e01d36"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.409333 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67003746-1204-4243-97c3-303e66e01d36-kube-api-access-2htlq" (OuterVolumeSpecName: "kube-api-access-2htlq") pod "67003746-1204-4243-97c3-303e66e01d36" (UID: "67003746-1204-4243-97c3-303e66e01d36"). InnerVolumeSpecName "kube-api-access-2htlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.432810 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "67003746-1204-4243-97c3-303e66e01d36" (UID: "67003746-1204-4243-97c3-303e66e01d36"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.501192 4803 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.501395 4803 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.501452 4803 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/67003746-1204-4243-97c3-303e66e01d36-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.501565 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-scripts\") on node \"crc\" DevicePath \"\"" 
Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.501623 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2htlq\" (UniqueName: \"kubernetes.io/projected/67003746-1204-4243-97c3-303e66e01d36-kube-api-access-2htlq\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.525331 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67003746-1204-4243-97c3-303e66e01d36" (UID: "67003746-1204-4243-97c3-303e66e01d36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.540109 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-config-data" (OuterVolumeSpecName: "config-data") pod "67003746-1204-4243-97c3-303e66e01d36" (UID: "67003746-1204-4243-97c3-303e66e01d36"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.603682 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.604041 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67003746-1204-4243-97c3-303e66e01d36-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.839484 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"67003746-1204-4243-97c3-303e66e01d36","Type":"ContainerDied","Data":"c5cf511c85a7bf9ac0a87dcfd08f5fa18ec93e44955d561b274c580c9cdf68e5"} Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.839582 4803 scope.go:117] "RemoveContainer" containerID="c30a8e3a3b072733209262bfd70afbfea55bc52ad7652c1d07f799ee6f6ed924" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.839739 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.879279 4803 scope.go:117] "RemoveContainer" containerID="4b3b6cc72ee6e4dac36de867cdbd28e6237c568dbfeda8457a36d3763f0d111d" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.884897 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.895105 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.903554 4803 scope.go:117] "RemoveContainer" containerID="228032a6f3fa52eee9c62a5bde539d4ff41641ab02bed721dd477fe67806451e" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.932959 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:18 crc kubenswrapper[4803]: E0320 17:36:18.933720 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-notification-agent" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.933741 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-notification-agent" Mar 20 17:36:18 crc kubenswrapper[4803]: E0320 17:36:18.933769 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="sg-core" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.933778 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="sg-core" Mar 20 17:36:18 crc kubenswrapper[4803]: E0320 17:36:18.933793 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-central-agent" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.933803 4803 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-central-agent" Mar 20 17:36:18 crc kubenswrapper[4803]: E0320 17:36:18.933851 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="proxy-httpd" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.933860 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="proxy-httpd" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.934085 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-central-agent" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.934109 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="ceilometer-notification-agent" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.934125 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="proxy-httpd" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.934149 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="67003746-1204-4243-97c3-303e66e01d36" containerName="sg-core" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.936196 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.939462 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.943209 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.953259 4803 scope.go:117] "RemoveContainer" containerID="edcc4d4b9e240dd16c1b628de76c9676190d27b6018ebdfadbbfd42517922167" Mar 20 17:36:18 crc kubenswrapper[4803]: I0320 17:36:18.953724 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.019441 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-scripts\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.019608 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mpsk\" (UniqueName: \"kubernetes.io/projected/575514d4-ed01-4841-96ff-ef78dde0aabf-kube-api-access-7mpsk\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.019632 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-config-data\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.019649 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-run-httpd\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.019666 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.019707 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.019761 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-log-httpd\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121017 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mpsk\" (UniqueName: \"kubernetes.io/projected/575514d4-ed01-4841-96ff-ef78dde0aabf-kube-api-access-7mpsk\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121098 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-config-data\") pod \"ceilometer-0\" (UID: 
\"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121129 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-run-httpd\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121172 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121197 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121264 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-log-httpd\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121294 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-scripts\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121713 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-run-httpd\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.121833 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-log-httpd\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.125694 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-config-data\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.126037 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.127256 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.131268 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-scripts\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.138369 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7mpsk\" (UniqueName: \"kubernetes.io/projected/575514d4-ed01-4841-96ff-ef78dde0aabf-kube-api-access-7mpsk\") pod \"ceilometer-0\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.275257 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.568638 4803 scope.go:117] "RemoveContainer" containerID="911b1a988b08afad42a215b61511b297323939eaaf8a72533b8ea62161019f7b" Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.748655 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:19 crc kubenswrapper[4803]: W0320 17:36:19.769097 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575514d4_ed01_4841_96ff_ef78dde0aabf.slice/crio-c45083a5f42059e6b34b3d5f3c983fe8b324865124e906bad95e6a9436d9caae WatchSource:0}: Error finding container c45083a5f42059e6b34b3d5f3c983fe8b324865124e906bad95e6a9436d9caae: Status 404 returned error can't find the container with id c45083a5f42059e6b34b3d5f3c983fe8b324865124e906bad95e6a9436d9caae Mar 20 17:36:19 crc kubenswrapper[4803]: I0320 17:36:19.852364 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerStarted","Data":"c45083a5f42059e6b34b3d5f3c983fe8b324865124e906bad95e6a9436d9caae"} Mar 20 17:36:20 crc kubenswrapper[4803]: I0320 17:36:20.875577 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67003746-1204-4243-97c3-303e66e01d36" path="/var/lib/kubelet/pods/67003746-1204-4243-97c3-303e66e01d36/volumes" Mar 20 17:36:21 crc kubenswrapper[4803]: I0320 17:36:21.007797 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] 
Mar 20 17:36:21 crc kubenswrapper[4803]: I0320 17:36:21.877439 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerStarted","Data":"ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74"} Mar 20 17:36:22 crc kubenswrapper[4803]: I0320 17:36:22.887821 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerStarted","Data":"46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6"} Mar 20 17:36:23 crc kubenswrapper[4803]: I0320 17:36:23.907579 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerStarted","Data":"66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a"} Mar 20 17:36:26 crc kubenswrapper[4803]: I0320 17:36:26.936901 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerStarted","Data":"5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1"} Mar 20 17:36:26 crc kubenswrapper[4803]: I0320 17:36:26.937573 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:36:26 crc kubenswrapper[4803]: I0320 17:36:26.937324 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="proxy-httpd" containerID="cri-o://5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1" gracePeriod=30 Mar 20 17:36:26 crc kubenswrapper[4803]: I0320 17:36:26.937102 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-central-agent" 
containerID="cri-o://ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74" gracePeriod=30 Mar 20 17:36:26 crc kubenswrapper[4803]: I0320 17:36:26.937388 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-notification-agent" containerID="cri-o://46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6" gracePeriod=30 Mar 20 17:36:26 crc kubenswrapper[4803]: I0320 17:36:26.937393 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="sg-core" containerID="cri-o://66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a" gracePeriod=30 Mar 20 17:36:26 crc kubenswrapper[4803]: I0320 17:36:26.980640 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.964573483 podStartE2EDuration="8.980613553s" podCreationTimestamp="2026-03-20 17:36:18 +0000 UTC" firstStartedPulling="2026-03-20 17:36:19.817693786 +0000 UTC m=+1189.729285856" lastFinishedPulling="2026-03-20 17:36:25.833733806 +0000 UTC m=+1195.745325926" observedRunningTime="2026-03-20 17:36:26.96491413 +0000 UTC m=+1196.876506230" watchObservedRunningTime="2026-03-20 17:36:26.980613553 +0000 UTC m=+1196.892205623" Mar 20 17:36:27 crc kubenswrapper[4803]: I0320 17:36:27.950594 4803 generic.go:334] "Generic (PLEG): container finished" podID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerID="5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1" exitCode=0 Mar 20 17:36:27 crc kubenswrapper[4803]: I0320 17:36:27.950873 4803 generic.go:334] "Generic (PLEG): container finished" podID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerID="66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a" exitCode=2 Mar 20 17:36:27 crc kubenswrapper[4803]: I0320 17:36:27.950884 4803 
generic.go:334] "Generic (PLEG): container finished" podID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerID="46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6" exitCode=0 Mar 20 17:36:27 crc kubenswrapper[4803]: I0320 17:36:27.950714 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerDied","Data":"5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1"} Mar 20 17:36:27 crc kubenswrapper[4803]: I0320 17:36:27.950927 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerDied","Data":"66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a"} Mar 20 17:36:27 crc kubenswrapper[4803]: I0320 17:36:27.950942 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerDied","Data":"46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6"} Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.784974 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.939812 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-run-httpd\") pod \"575514d4-ed01-4841-96ff-ef78dde0aabf\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.939965 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mpsk\" (UniqueName: \"kubernetes.io/projected/575514d4-ed01-4841-96ff-ef78dde0aabf-kube-api-access-7mpsk\") pod \"575514d4-ed01-4841-96ff-ef78dde0aabf\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940046 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-combined-ca-bundle\") pod \"575514d4-ed01-4841-96ff-ef78dde0aabf\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940091 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-scripts\") pod \"575514d4-ed01-4841-96ff-ef78dde0aabf\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940155 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-log-httpd\") pod \"575514d4-ed01-4841-96ff-ef78dde0aabf\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940190 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-sg-core-conf-yaml\") pod \"575514d4-ed01-4841-96ff-ef78dde0aabf\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940269 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-config-data\") pod \"575514d4-ed01-4841-96ff-ef78dde0aabf\" (UID: \"575514d4-ed01-4841-96ff-ef78dde0aabf\") " Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940430 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "575514d4-ed01-4841-96ff-ef78dde0aabf" (UID: "575514d4-ed01-4841-96ff-ef78dde0aabf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940700 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "575514d4-ed01-4841-96ff-ef78dde0aabf" (UID: "575514d4-ed01-4841-96ff-ef78dde0aabf"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940836 4803 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.940862 4803 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/575514d4-ed01-4841-96ff-ef78dde0aabf-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.945955 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-scripts" (OuterVolumeSpecName: "scripts") pod "575514d4-ed01-4841-96ff-ef78dde0aabf" (UID: "575514d4-ed01-4841-96ff-ef78dde0aabf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.946836 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575514d4-ed01-4841-96ff-ef78dde0aabf-kube-api-access-7mpsk" (OuterVolumeSpecName: "kube-api-access-7mpsk") pod "575514d4-ed01-4841-96ff-ef78dde0aabf" (UID: "575514d4-ed01-4841-96ff-ef78dde0aabf"). InnerVolumeSpecName "kube-api-access-7mpsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.972599 4803 generic.go:334] "Generic (PLEG): container finished" podID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerID="ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74" exitCode=0 Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.973320 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerDied","Data":"ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74"} Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.973488 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"575514d4-ed01-4841-96ff-ef78dde0aabf","Type":"ContainerDied","Data":"c45083a5f42059e6b34b3d5f3c983fe8b324865124e906bad95e6a9436d9caae"} Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.973651 4803 scope.go:117] "RemoveContainer" containerID="5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.973927 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:28 crc kubenswrapper[4803]: I0320 17:36:28.985177 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "575514d4-ed01-4841-96ff-ef78dde0aabf" (UID: "575514d4-ed01-4841-96ff-ef78dde0aabf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.039119 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "575514d4-ed01-4841-96ff-ef78dde0aabf" (UID: "575514d4-ed01-4841-96ff-ef78dde0aabf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.042615 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mpsk\" (UniqueName: \"kubernetes.io/projected/575514d4-ed01-4841-96ff-ef78dde0aabf-kube-api-access-7mpsk\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.042896 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.043107 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.043366 4803 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.052411 4803 scope.go:117] "RemoveContainer" containerID="66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.075803 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-config-data" (OuterVolumeSpecName: 
"config-data") pod "575514d4-ed01-4841-96ff-ef78dde0aabf" (UID: "575514d4-ed01-4841-96ff-ef78dde0aabf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.091393 4803 scope.go:117] "RemoveContainer" containerID="46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.116811 4803 scope.go:117] "RemoveContainer" containerID="ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.140667 4803 scope.go:117] "RemoveContainer" containerID="5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1" Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.141309 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1\": container with ID starting with 5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1 not found: ID does not exist" containerID="5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.141378 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1"} err="failed to get container status \"5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1\": rpc error: code = NotFound desc = could not find container \"5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1\": container with ID starting with 5a442101016f716be6ad390db72ff59f9aee685bbef5b25430a6ce46f54dd5b1 not found: ID does not exist" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.141419 4803 scope.go:117] "RemoveContainer" containerID="66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a" Mar 20 17:36:29 crc 
kubenswrapper[4803]: E0320 17:36:29.141912 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a\": container with ID starting with 66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a not found: ID does not exist" containerID="66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.141960 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a"} err="failed to get container status \"66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a\": rpc error: code = NotFound desc = could not find container \"66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a\": container with ID starting with 66819d6ced2d6ea88c8137cea71db25c08d0c29f2bc5aa3e0ebe51dbc6e5129a not found: ID does not exist" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.141994 4803 scope.go:117] "RemoveContainer" containerID="46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6" Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.142451 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6\": container with ID starting with 46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6 not found: ID does not exist" containerID="46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.142485 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6"} err="failed to get container status 
\"46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6\": rpc error: code = NotFound desc = could not find container \"46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6\": container with ID starting with 46687d2adc5e32a6b23c1cef848ae5f197265d28d97ef1ebe2021c5933e0c3f6 not found: ID does not exist" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.142508 4803 scope.go:117] "RemoveContainer" containerID="ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.145099 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/575514d4-ed01-4841-96ff-ef78dde0aabf-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.153312 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74\": container with ID starting with ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74 not found: ID does not exist" containerID="ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.153356 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74"} err="failed to get container status \"ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74\": rpc error: code = NotFound desc = could not find container \"ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74\": container with ID starting with ddd2278286bc094073b6bca8e094bae224fe30f9d9e282f1f6051ee98f69bb74 not found: ID does not exist" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.317755 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:29 crc 
kubenswrapper[4803]: I0320 17:36:29.325218 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.345766 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.346189 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-central-agent" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346212 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-central-agent" Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.346243 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-notification-agent" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346252 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-notification-agent" Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.346267 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="proxy-httpd" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346276 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="proxy-httpd" Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.346351 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="sg-core" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346380 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="sg-core" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346806 4803 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="sg-core" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346840 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="proxy-httpd" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346864 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-notification-agent" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.346880 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" containerName="ceilometer-central-agent" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.348842 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.355004 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.355483 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.362289 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.453701 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.453787 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqh5\" (UniqueName: 
\"kubernetes.io/projected/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-kube-api-access-bkqh5\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.453810 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.453831 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-run-httpd\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.453888 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-log-httpd\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.453999 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-scripts\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.454046 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-config-data\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " 
pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: E0320 17:36:29.479220 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod575514d4_ed01_4841_96ff_ef78dde0aabf.slice\": RecentStats: unable to find data in memory cache]" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.556149 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-scripts\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.556202 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-config-data\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.556284 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.556359 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqh5\" (UniqueName: \"kubernetes.io/projected/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-kube-api-access-bkqh5\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.556382 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.556396 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-run-httpd\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.556450 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-log-httpd\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.557057 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-run-httpd\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.557217 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-log-httpd\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.560303 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-scripts\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.561153 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.562661 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-config-data\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.572204 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.573827 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqh5\" (UniqueName: \"kubernetes.io/projected/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-kube-api-access-bkqh5\") pod \"ceilometer-0\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " pod="openstack/ceilometer-0" Mar 20 17:36:29 crc kubenswrapper[4803]: I0320 17:36:29.766247 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:36:30 crc kubenswrapper[4803]: I0320 17:36:30.230459 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:36:30 crc kubenswrapper[4803]: W0320 17:36:30.233102 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e7c73c_3a6d_47f3_9d3d_4096eb3477c3.slice/crio-4683005a11243fb852a0c778a29aa6b50ec65ceae2336229a6179cc6eb0149e8 WatchSource:0}: Error finding container 4683005a11243fb852a0c778a29aa6b50ec65ceae2336229a6179cc6eb0149e8: Status 404 returned error can't find the container with id 4683005a11243fb852a0c778a29aa6b50ec65ceae2336229a6179cc6eb0149e8 Mar 20 17:36:30 crc kubenswrapper[4803]: I0320 17:36:30.864815 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575514d4-ed01-4841-96ff-ef78dde0aabf" path="/var/lib/kubelet/pods/575514d4-ed01-4841-96ff-ef78dde0aabf/volumes" Mar 20 17:36:30 crc kubenswrapper[4803]: I0320 17:36:30.997105 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerStarted","Data":"4683005a11243fb852a0c778a29aa6b50ec65ceae2336229a6179cc6eb0149e8"} Mar 20 17:36:32 crc kubenswrapper[4803]: I0320 17:36:32.012272 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerStarted","Data":"3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417"} Mar 20 17:36:32 crc kubenswrapper[4803]: I0320 17:36:32.013068 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerStarted","Data":"a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913"} Mar 20 17:36:32 crc kubenswrapper[4803]: I0320 17:36:32.014215 4803 generic.go:334] "Generic (PLEG): 
container finished" podID="c9eefa09-fd26-4111-a6e7-605629e9192c" containerID="0c2492519b69d5e624e7e1bfcdac8a3488889d3241da936111a3347141004dd1" exitCode=0 Mar 20 17:36:32 crc kubenswrapper[4803]: I0320 17:36:32.014256 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" event={"ID":"c9eefa09-fd26-4111-a6e7-605629e9192c","Type":"ContainerDied","Data":"0c2492519b69d5e624e7e1bfcdac8a3488889d3241da936111a3347141004dd1"} Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.025585 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerStarted","Data":"e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9"} Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.384309 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.447469 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk895\" (UniqueName: \"kubernetes.io/projected/c9eefa09-fd26-4111-a6e7-605629e9192c-kube-api-access-pk895\") pod \"c9eefa09-fd26-4111-a6e7-605629e9192c\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.447568 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-combined-ca-bundle\") pod \"c9eefa09-fd26-4111-a6e7-605629e9192c\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.447644 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-scripts\") pod \"c9eefa09-fd26-4111-a6e7-605629e9192c\" (UID: 
\"c9eefa09-fd26-4111-a6e7-605629e9192c\") " Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.447738 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-config-data\") pod \"c9eefa09-fd26-4111-a6e7-605629e9192c\" (UID: \"c9eefa09-fd26-4111-a6e7-605629e9192c\") " Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.452970 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9eefa09-fd26-4111-a6e7-605629e9192c-kube-api-access-pk895" (OuterVolumeSpecName: "kube-api-access-pk895") pod "c9eefa09-fd26-4111-a6e7-605629e9192c" (UID: "c9eefa09-fd26-4111-a6e7-605629e9192c"). InnerVolumeSpecName "kube-api-access-pk895". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.453624 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-scripts" (OuterVolumeSpecName: "scripts") pod "c9eefa09-fd26-4111-a6e7-605629e9192c" (UID: "c9eefa09-fd26-4111-a6e7-605629e9192c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.475643 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-config-data" (OuterVolumeSpecName: "config-data") pod "c9eefa09-fd26-4111-a6e7-605629e9192c" (UID: "c9eefa09-fd26-4111-a6e7-605629e9192c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.491040 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9eefa09-fd26-4111-a6e7-605629e9192c" (UID: "c9eefa09-fd26-4111-a6e7-605629e9192c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.549912 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk895\" (UniqueName: \"kubernetes.io/projected/c9eefa09-fd26-4111-a6e7-605629e9192c-kube-api-access-pk895\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.549958 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.549970 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:33 crc kubenswrapper[4803]: I0320 17:36:33.549980 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9eefa09-fd26-4111-a6e7-605629e9192c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.041907 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" event={"ID":"c9eefa09-fd26-4111-a6e7-605629e9192c","Type":"ContainerDied","Data":"8cfebf94d5468da26653e5b0d94f51d1d62cfc26c33f68ab58b261b0c1a24388"} Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.042940 4803 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="8cfebf94d5468da26653e5b0d94f51d1d62cfc26c33f68ab58b261b0c1a24388" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.042020 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4jd6h" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.119316 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:36:34 crc kubenswrapper[4803]: E0320 17:36:34.119814 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9eefa09-fd26-4111-a6e7-605629e9192c" containerName="nova-cell0-conductor-db-sync" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.119836 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9eefa09-fd26-4111-a6e7-605629e9192c" containerName="nova-cell0-conductor-db-sync" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.120025 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9eefa09-fd26-4111-a6e7-605629e9192c" containerName="nova-cell0-conductor-db-sync" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.120763 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.123921 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-246vn" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.128575 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.144574 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.270046 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2e7092-e2b5-4f0f-bde6-892cb3660837-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.270114 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn5rj\" (UniqueName: \"kubernetes.io/projected/ad2e7092-e2b5-4f0f-bde6-892cb3660837-kube-api-access-hn5rj\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.270216 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2e7092-e2b5-4f0f-bde6-892cb3660837-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.372083 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ad2e7092-e2b5-4f0f-bde6-892cb3660837-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.372355 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn5rj\" (UniqueName: \"kubernetes.io/projected/ad2e7092-e2b5-4f0f-bde6-892cb3660837-kube-api-access-hn5rj\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.372499 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2e7092-e2b5-4f0f-bde6-892cb3660837-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.376199 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad2e7092-e2b5-4f0f-bde6-892cb3660837-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.377703 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad2e7092-e2b5-4f0f-bde6-892cb3660837-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.396613 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn5rj\" (UniqueName: \"kubernetes.io/projected/ad2e7092-e2b5-4f0f-bde6-892cb3660837-kube-api-access-hn5rj\") pod \"nova-cell0-conductor-0\" (UID: 
\"ad2e7092-e2b5-4f0f-bde6-892cb3660837\") " pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.445548 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:34 crc kubenswrapper[4803]: I0320 17:36:34.758693 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 17:36:35 crc kubenswrapper[4803]: I0320 17:36:35.056218 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ad2e7092-e2b5-4f0f-bde6-892cb3660837","Type":"ContainerStarted","Data":"5f7b6f3ddee4e50592f3de1840ab50e05168606d9d9a3519e5c80a1ee03590d7"} Mar 20 17:36:35 crc kubenswrapper[4803]: I0320 17:36:35.056578 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ad2e7092-e2b5-4f0f-bde6-892cb3660837","Type":"ContainerStarted","Data":"7e69647f32c80becd5392672511f511d27eca44e8412443f12d207edc113f86d"} Mar 20 17:36:35 crc kubenswrapper[4803]: I0320 17:36:35.056729 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 17:36:35 crc kubenswrapper[4803]: I0320 17:36:35.093624 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.093604145 podStartE2EDuration="1.093604145s" podCreationTimestamp="2026-03-20 17:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:35.083625136 +0000 UTC m=+1204.995217246" watchObservedRunningTime="2026-03-20 17:36:35.093604145 +0000 UTC m=+1205.005196225" Mar 20 17:36:36 crc kubenswrapper[4803]: I0320 17:36:36.071441 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerStarted","Data":"fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf"} Mar 20 17:36:36 crc kubenswrapper[4803]: I0320 17:36:36.109929 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2732809449999998 podStartE2EDuration="7.109909323s" podCreationTimestamp="2026-03-20 17:36:29 +0000 UTC" firstStartedPulling="2026-03-20 17:36:30.236875015 +0000 UTC m=+1200.148467085" lastFinishedPulling="2026-03-20 17:36:35.073503353 +0000 UTC m=+1204.985095463" observedRunningTime="2026-03-20 17:36:36.100632043 +0000 UTC m=+1206.012224203" watchObservedRunningTime="2026-03-20 17:36:36.109909323 +0000 UTC m=+1206.021501403" Mar 20 17:36:37 crc kubenswrapper[4803]: I0320 17:36:37.080184 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:36:38 crc kubenswrapper[4803]: I0320 17:36:38.246210 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:36:38 crc kubenswrapper[4803]: I0320 17:36:38.246629 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:36:38 crc kubenswrapper[4803]: I0320 17:36:38.246709 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:36:38 crc kubenswrapper[4803]: I0320 17:36:38.247790 4803 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1bf505c950c915ca8f0c808968521bcabc9e302497ac83cfddf349e9c7d8bb55"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:36:38 crc kubenswrapper[4803]: I0320 17:36:38.247893 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://1bf505c950c915ca8f0c808968521bcabc9e302497ac83cfddf349e9c7d8bb55" gracePeriod=600 Mar 20 17:36:39 crc kubenswrapper[4803]: I0320 17:36:39.112312 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="1bf505c950c915ca8f0c808968521bcabc9e302497ac83cfddf349e9c7d8bb55" exitCode=0 Mar 20 17:36:39 crc kubenswrapper[4803]: I0320 17:36:39.112368 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"1bf505c950c915ca8f0c808968521bcabc9e302497ac83cfddf349e9c7d8bb55"} Mar 20 17:36:39 crc kubenswrapper[4803]: I0320 17:36:39.112811 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"7ddde2c658fdb702fa12402952a89565e3db989d32a4cbf09d114ff157fa7417"} Mar 20 17:36:39 crc kubenswrapper[4803]: I0320 17:36:39.112850 4803 scope.go:117] "RemoveContainer" containerID="6b7da3dcd15f360247e71d17a90f38b394013be03e278a625e311daaaf87c2cd" Mar 20 17:36:39 crc kubenswrapper[4803]: I0320 17:36:39.497030 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" 
Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.089284 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vvprx"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.091461 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.104959 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.106392 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.113547 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vvprx"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.243066 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-config-data\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.243145 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhqbm\" (UniqueName: \"kubernetes.io/projected/6d2e08fd-0246-49d5-b874-93ba4a2915b6-kube-api-access-rhqbm\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.243194 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vvprx\" 
(UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.243239 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-scripts\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.345768 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.345866 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-scripts\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.345975 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-config-data\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.346011 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhqbm\" (UniqueName: \"kubernetes.io/projected/6d2e08fd-0246-49d5-b874-93ba4a2915b6-kube-api-access-rhqbm\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " 
pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.353989 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.361076 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-scripts\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.364805 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.369432 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-config-data\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.369616 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.378267 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.379024 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhqbm\" (UniqueName: \"kubernetes.io/projected/6d2e08fd-0246-49d5-b874-93ba4a2915b6-kube-api-access-rhqbm\") pod \"nova-cell0-cell-mapping-vvprx\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.379315 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.382608 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.384439 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.392064 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.408049 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.427040 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.549460 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-config-data\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.549507 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhlhc\" (UniqueName: \"kubernetes.io/projected/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-kube-api-access-vhlhc\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.549550 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.549585 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-config-data\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.549614 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-logs\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 
17:36:40.549642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.549661 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc25876f-b61b-415a-bf28-b2966c79dc74-logs\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.549710 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65blj\" (UniqueName: \"kubernetes.io/projected/cc25876f-b61b-415a-bf28-b2966c79dc74-kube-api-access-65blj\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.557812 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-bcc55"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.559270 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.591264 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-bcc55"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.629127 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.630553 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.635544 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.642417 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659251 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659327 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65blj\" (UniqueName: \"kubernetes.io/projected/cc25876f-b61b-415a-bf28-b2966c79dc74-kube-api-access-65blj\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659476 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-config-data\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659499 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsfdx\" (UniqueName: \"kubernetes.io/projected/b6477cd4-502c-467c-8cc1-b706f62f249d-kube-api-access-bsfdx\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 
17:36:40.659536 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhlhc\" (UniqueName: \"kubernetes.io/projected/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-kube-api-access-vhlhc\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659581 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659640 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659667 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-config-data\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659729 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-logs\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659763 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659804 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659831 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc25876f-b61b-415a-bf28-b2966c79dc74-logs\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659859 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.659901 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-config\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.667667 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-config-data\") pod \"nova-metadata-0\" (UID: 
\"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.670328 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-logs\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.683387 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.685584 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.685843 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.686609 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.686789 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc25876f-b61b-415a-bf28-b2966c79dc74-logs\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.687319 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-config-data\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.688472 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65blj\" (UniqueName: \"kubernetes.io/projected/cc25876f-b61b-415a-bf28-b2966c79dc74-kube-api-access-65blj\") pod \"nova-api-0\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.695461 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.704786 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhlhc\" (UniqueName: \"kubernetes.io/projected/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-kube-api-access-vhlhc\") pod \"nova-metadata-0\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") " pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.708447 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761669 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761741 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761769 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-config\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761793 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761878 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761899 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsfdx\" (UniqueName: 
\"kubernetes.io/projected/b6477cd4-502c-467c-8cc1-b706f62f249d-kube-api-access-bsfdx\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761929 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761956 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.761974 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x8b9\" (UniqueName: \"kubernetes.io/projected/12ba7dc7-8933-457c-ae40-4314d7ef54b5-kube-api-access-6x8b9\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.762622 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.762867 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.763187 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.764509 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.765139 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-config\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.778611 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsfdx\" (UniqueName: \"kubernetes.io/projected/b6477cd4-502c-467c-8cc1-b706f62f249d-kube-api-access-bsfdx\") pod \"dnsmasq-dns-bccf8f775-bcc55\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.858480 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.863626 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.863675 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpxqt\" (UniqueName: \"kubernetes.io/projected/8e2fb0a0-0a72-45b5-8907-272843a4d54f-kube-api-access-zpxqt\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.863703 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.863740 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.863768 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x8b9\" (UniqueName: \"kubernetes.io/projected/12ba7dc7-8933-457c-ae40-4314d7ef54b5-kube-api-access-6x8b9\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 
17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.863819 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-config-data\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.866878 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.868162 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.871094 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.882160 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x8b9\" (UniqueName: \"kubernetes.io/projected/12ba7dc7-8933-457c-ae40-4314d7ef54b5-kube-api-access-6x8b9\") pod \"nova-cell1-novncproxy-0\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.897738 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.952045 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.969656 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-config-data\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.969835 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.969860 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpxqt\" (UniqueName: \"kubernetes.io/projected/8e2fb0a0-0a72-45b5-8907-272843a4d54f-kube-api-access-zpxqt\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.978836 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.983949 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-config-data\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:40 crc kubenswrapper[4803]: I0320 17:36:40.993930 4803 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zpxqt\" (UniqueName: \"kubernetes.io/projected/8e2fb0a0-0a72-45b5-8907-272843a4d54f-kube-api-access-zpxqt\") pod \"nova-scheduler-0\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.020238 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vvprx"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.038340 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.165821 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vvprx" event={"ID":"6d2e08fd-0246-49d5-b874-93ba4a2915b6","Type":"ContainerStarted","Data":"b96d9d175d436fb03b935375d4286ffd930f1e2274486e5f6a72e791129bd896"} Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.184663 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkxj8"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.185966 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.188942 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.189096 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.193895 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkxj8"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.376833 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-scripts\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.377227 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-config-data\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.377457 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.377582 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5p8wp\" (UniqueName: \"kubernetes.io/projected/51db7102-cc0d-407f-8d55-2a96de3f6f63-kube-api-access-5p8wp\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.403558 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.496782 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-scripts\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.496834 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-config-data\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.498106 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.498154 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8wp\" (UniqueName: \"kubernetes.io/projected/51db7102-cc0d-407f-8d55-2a96de3f6f63-kube-api-access-5p8wp\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " 
pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.521126 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-config-data\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.521615 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8wp\" (UniqueName: \"kubernetes.io/projected/51db7102-cc0d-407f-8d55-2a96de3f6f63-kube-api-access-5p8wp\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.529331 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: W0320 17:36:41.537463 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a4d899_9cb9_455b_8ca3_ac66c9f30964.slice/crio-1c4c5a3d664bb98d7fba7ec45456dc7762e62a30205cf2b263c784b0b048ae84 WatchSource:0}: Error finding container 1c4c5a3d664bb98d7fba7ec45456dc7762e62a30205cf2b263c784b0b048ae84: Status 404 returned error can't find the container with id 1c4c5a3d664bb98d7fba7ec45456dc7762e62a30205cf2b263c784b0b048ae84 Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.537675 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-scripts\") pod \"nova-cell1-conductor-db-sync-mkxj8\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.541446 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.584954 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-bcc55"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.593795 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.701783 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:41 crc kubenswrapper[4803]: I0320 17:36:41.808606 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.176064 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cc25876f-b61b-415a-bf28-b2966c79dc74","Type":"ContainerStarted","Data":"5980a2524c4b1554fddab955bb6bbd6d97b7ef2b6b1c36630a0703ccbc9e47c7"} Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.179547 4803 generic.go:334] "Generic (PLEG): container finished" podID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerID="ae3678264c6fafa26d473651cf78cccc118b69c9b56377f978a54d4f9553f1e4" exitCode=0 Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.179651 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" event={"ID":"b6477cd4-502c-467c-8cc1-b706f62f249d","Type":"ContainerDied","Data":"ae3678264c6fafa26d473651cf78cccc118b69c9b56377f978a54d4f9553f1e4"} Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.179744 4803 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" event={"ID":"b6477cd4-502c-467c-8cc1-b706f62f249d","Type":"ContainerStarted","Data":"6e1eef29f8347a3101854feb44063b61577bb4e6b4eab4090bb0ba97f446d385"} Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.181283 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5a4d899-9cb9-455b-8ca3-ac66c9f30964","Type":"ContainerStarted","Data":"1c4c5a3d664bb98d7fba7ec45456dc7762e62a30205cf2b263c784b0b048ae84"} Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.183130 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12ba7dc7-8933-457c-ae40-4314d7ef54b5","Type":"ContainerStarted","Data":"453e92c0ea7d78ac7f190c022c895c5e8cf10a9794b3c1c5924bc74b90ddeff6"} Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.186329 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vvprx" event={"ID":"6d2e08fd-0246-49d5-b874-93ba4a2915b6","Type":"ContainerStarted","Data":"90e5c3c3e8794fc7317ca12d7bca72a137376bf9defdbc91d127b2f67d471087"} Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.189009 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8e2fb0a0-0a72-45b5-8907-272843a4d54f","Type":"ContainerStarted","Data":"3c011727a5231d7468d2ce22490ff7aaf058a402a4b2829a2b6d0c3d68cc4b70"} Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.235172 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vvprx" podStartSLOduration=2.235154076 podStartE2EDuration="2.235154076s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:42.222941417 +0000 UTC m=+1212.134533507" watchObservedRunningTime="2026-03-20 17:36:42.235154076 +0000 UTC 
m=+1212.146746136" Mar 20 17:36:42 crc kubenswrapper[4803]: I0320 17:36:42.365853 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkxj8"] Mar 20 17:36:42 crc kubenswrapper[4803]: W0320 17:36:42.406005 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51db7102_cc0d_407f_8d55_2a96de3f6f63.slice/crio-9d32d342ed0c2380b40268286a05b8c65b23afa1403f48056a28bcf285db66b6 WatchSource:0}: Error finding container 9d32d342ed0c2380b40268286a05b8c65b23afa1403f48056a28bcf285db66b6: Status 404 returned error can't find the container with id 9d32d342ed0c2380b40268286a05b8c65b23afa1403f48056a28bcf285db66b6 Mar 20 17:36:43 crc kubenswrapper[4803]: I0320 17:36:43.199712 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" event={"ID":"b6477cd4-502c-467c-8cc1-b706f62f249d","Type":"ContainerStarted","Data":"81a6ad1f271b17b2925da46daa67eb84639642beee0928c9d4179daf0501e2ed"} Mar 20 17:36:43 crc kubenswrapper[4803]: I0320 17:36:43.200162 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:43 crc kubenswrapper[4803]: I0320 17:36:43.201338 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" event={"ID":"51db7102-cc0d-407f-8d55-2a96de3f6f63","Type":"ContainerStarted","Data":"a62ea244ef295ac92f292c6d911909f44bc278ede51fdc5861157b3b71b73e41"} Mar 20 17:36:43 crc kubenswrapper[4803]: I0320 17:36:43.201383 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" event={"ID":"51db7102-cc0d-407f-8d55-2a96de3f6f63","Type":"ContainerStarted","Data":"9d32d342ed0c2380b40268286a05b8c65b23afa1403f48056a28bcf285db66b6"} Mar 20 17:36:43 crc kubenswrapper[4803]: I0320 17:36:43.227938 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-bccf8f775-bcc55" podStartSLOduration=3.22791417 podStartE2EDuration="3.22791417s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:43.217103479 +0000 UTC m=+1213.128695559" watchObservedRunningTime="2026-03-20 17:36:43.22791417 +0000 UTC m=+1213.139506240" Mar 20 17:36:43 crc kubenswrapper[4803]: I0320 17:36:43.245954 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" podStartSLOduration=2.245934266 podStartE2EDuration="2.245934266s" podCreationTimestamp="2026-03-20 17:36:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:43.237906719 +0000 UTC m=+1213.149498789" watchObservedRunningTime="2026-03-20 17:36:43.245934266 +0000 UTC m=+1213.157526356" Mar 20 17:36:44 crc kubenswrapper[4803]: I0320 17:36:44.142469 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:36:44 crc kubenswrapper[4803]: I0320 17:36:44.199890 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.222730 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5a4d899-9cb9-455b-8ca3-ac66c9f30964","Type":"ContainerStarted","Data":"f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6"} Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.223188 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5a4d899-9cb9-455b-8ca3-ac66c9f30964","Type":"ContainerStarted","Data":"83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc"} Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.222952 4803 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-metadata" containerID="cri-o://f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6" gracePeriod=30 Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.222836 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-log" containerID="cri-o://83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc" gracePeriod=30 Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.224664 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12ba7dc7-8933-457c-ae40-4314d7ef54b5","Type":"ContainerStarted","Data":"2339521c4ff2f7943a38f4a14fd5670e3f89c1ceea7b7fcac050e7ea5ea97ac1"} Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.224777 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="12ba7dc7-8933-457c-ae40-4314d7ef54b5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2339521c4ff2f7943a38f4a14fd5670e3f89c1ceea7b7fcac050e7ea5ea97ac1" gracePeriod=30 Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.231629 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8e2fb0a0-0a72-45b5-8907-272843a4d54f","Type":"ContainerStarted","Data":"d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda"} Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.237622 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cc25876f-b61b-415a-bf28-b2966c79dc74","Type":"ContainerStarted","Data":"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6"} Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.237650 4803 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cc25876f-b61b-415a-bf28-b2966c79dc74","Type":"ContainerStarted","Data":"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9"} Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.245196 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.2375013089999998 podStartE2EDuration="5.245184245s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="2026-03-20 17:36:41.539263969 +0000 UTC m=+1211.450856039" lastFinishedPulling="2026-03-20 17:36:44.546946905 +0000 UTC m=+1214.458538975" observedRunningTime="2026-03-20 17:36:45.24201747 +0000 UTC m=+1215.153609550" watchObservedRunningTime="2026-03-20 17:36:45.245184245 +0000 UTC m=+1215.156776315" Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.263213 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.13469645 podStartE2EDuration="5.263198171s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="2026-03-20 17:36:41.412263757 +0000 UTC m=+1211.323855817" lastFinishedPulling="2026-03-20 17:36:44.540765468 +0000 UTC m=+1214.452357538" observedRunningTime="2026-03-20 17:36:45.261134575 +0000 UTC m=+1215.172726645" watchObservedRunningTime="2026-03-20 17:36:45.263198171 +0000 UTC m=+1215.174790241" Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.280907 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.332967381 podStartE2EDuration="5.280888557s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="2026-03-20 17:36:41.595698689 +0000 UTC m=+1211.507290759" lastFinishedPulling="2026-03-20 17:36:44.543619865 +0000 UTC m=+1214.455211935" observedRunningTime="2026-03-20 17:36:45.276765046 +0000 UTC m=+1215.188357126" 
watchObservedRunningTime="2026-03-20 17:36:45.280888557 +0000 UTC m=+1215.192480627" Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.299346 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.488073659 podStartE2EDuration="5.299331304s" podCreationTimestamp="2026-03-20 17:36:40 +0000 UTC" firstStartedPulling="2026-03-20 17:36:41.730408968 +0000 UTC m=+1211.642001038" lastFinishedPulling="2026-03-20 17:36:44.541666613 +0000 UTC m=+1214.453258683" observedRunningTime="2026-03-20 17:36:45.297807673 +0000 UTC m=+1215.209399753" watchObservedRunningTime="2026-03-20 17:36:45.299331304 +0000 UTC m=+1215.210923374" Mar 20 17:36:45 crc kubenswrapper[4803]: I0320 17:36:45.952589 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:36:46 crc kubenswrapper[4803]: I0320 17:36:46.038768 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:36:46 crc kubenswrapper[4803]: I0320 17:36:46.252399 4803 generic.go:334] "Generic (PLEG): container finished" podID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerID="83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc" exitCode=143 Mar 20 17:36:46 crc kubenswrapper[4803]: I0320 17:36:46.252471 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5a4d899-9cb9-455b-8ca3-ac66c9f30964","Type":"ContainerDied","Data":"83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc"} Mar 20 17:36:48 crc kubenswrapper[4803]: I0320 17:36:48.280705 4803 generic.go:334] "Generic (PLEG): container finished" podID="6d2e08fd-0246-49d5-b874-93ba4a2915b6" containerID="90e5c3c3e8794fc7317ca12d7bca72a137376bf9defdbc91d127b2f67d471087" exitCode=0 Mar 20 17:36:48 crc kubenswrapper[4803]: I0320 17:36:48.280789 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-vvprx" event={"ID":"6d2e08fd-0246-49d5-b874-93ba4a2915b6","Type":"ContainerDied","Data":"90e5c3c3e8794fc7317ca12d7bca72a137376bf9defdbc91d127b2f67d471087"} Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.711248 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.789804 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhqbm\" (UniqueName: \"kubernetes.io/projected/6d2e08fd-0246-49d5-b874-93ba4a2915b6-kube-api-access-rhqbm\") pod \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.789853 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-config-data\") pod \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.789896 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-scripts\") pod \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.790004 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-combined-ca-bundle\") pod \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\" (UID: \"6d2e08fd-0246-49d5-b874-93ba4a2915b6\") " Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.795864 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-scripts" (OuterVolumeSpecName: "scripts") pod "6d2e08fd-0246-49d5-b874-93ba4a2915b6" (UID: "6d2e08fd-0246-49d5-b874-93ba4a2915b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.797816 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d2e08fd-0246-49d5-b874-93ba4a2915b6-kube-api-access-rhqbm" (OuterVolumeSpecName: "kube-api-access-rhqbm") pod "6d2e08fd-0246-49d5-b874-93ba4a2915b6" (UID: "6d2e08fd-0246-49d5-b874-93ba4a2915b6"). InnerVolumeSpecName "kube-api-access-rhqbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.825792 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-config-data" (OuterVolumeSpecName: "config-data") pod "6d2e08fd-0246-49d5-b874-93ba4a2915b6" (UID: "6d2e08fd-0246-49d5-b874-93ba4a2915b6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.850799 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d2e08fd-0246-49d5-b874-93ba4a2915b6" (UID: "6d2e08fd-0246-49d5-b874-93ba4a2915b6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.892313 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.892338 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.892347 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d2e08fd-0246-49d5-b874-93ba4a2915b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:49 crc kubenswrapper[4803]: I0320 17:36:49.892357 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhqbm\" (UniqueName: \"kubernetes.io/projected/6d2e08fd-0246-49d5-b874-93ba4a2915b6-kube-api-access-rhqbm\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.306937 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vvprx" Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.306998 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vvprx" event={"ID":"6d2e08fd-0246-49d5-b874-93ba4a2915b6","Type":"ContainerDied","Data":"b96d9d175d436fb03b935375d4286ffd930f1e2274486e5f6a72e791129bd896"} Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.307873 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b96d9d175d436fb03b935375d4286ffd930f1e2274486e5f6a72e791129bd896" Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.310435 4803 generic.go:334] "Generic (PLEG): container finished" podID="51db7102-cc0d-407f-8d55-2a96de3f6f63" containerID="a62ea244ef295ac92f292c6d911909f44bc278ede51fdc5861157b3b71b73e41" exitCode=0 Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.310479 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" event={"ID":"51db7102-cc0d-407f-8d55-2a96de3f6f63","Type":"ContainerDied","Data":"a62ea244ef295ac92f292c6d911909f44bc278ede51fdc5861157b3b71b73e41"} Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.520879 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.521464 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerName="nova-api-log" containerID="cri-o://256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9" gracePeriod=30 Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.521655 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerName="nova-api-api" containerID="cri-o://d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6" 
gracePeriod=30 Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.537000 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.537210 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8e2fb0a0-0a72-45b5-8907-272843a4d54f" containerName="nova-scheduler-scheduler" containerID="cri-o://d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda" gracePeriod=30 Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.903786 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.981095 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cz888"] Mar 20 17:36:50 crc kubenswrapper[4803]: I0320 17:36:50.981322 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-cz888" podUID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerName="dnsmasq-dns" containerID="cri-o://21db21b0f47fef938d7eda7d1ad82c56a83a469f378ee281fae2661ad2d28a7c" gracePeriod=10 Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.190626 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.217383 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-config-data\") pod \"cc25876f-b61b-415a-bf28-b2966c79dc74\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.217779 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65blj\" (UniqueName: \"kubernetes.io/projected/cc25876f-b61b-415a-bf28-b2966c79dc74-kube-api-access-65blj\") pod \"cc25876f-b61b-415a-bf28-b2966c79dc74\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.217915 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc25876f-b61b-415a-bf28-b2966c79dc74-logs\") pod \"cc25876f-b61b-415a-bf28-b2966c79dc74\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.218008 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-combined-ca-bundle\") pod \"cc25876f-b61b-415a-bf28-b2966c79dc74\" (UID: \"cc25876f-b61b-415a-bf28-b2966c79dc74\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.218288 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc25876f-b61b-415a-bf28-b2966c79dc74-logs" (OuterVolumeSpecName: "logs") pod "cc25876f-b61b-415a-bf28-b2966c79dc74" (UID: "cc25876f-b61b-415a-bf28-b2966c79dc74"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.221104 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc25876f-b61b-415a-bf28-b2966c79dc74-logs\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.223737 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc25876f-b61b-415a-bf28-b2966c79dc74-kube-api-access-65blj" (OuterVolumeSpecName: "kube-api-access-65blj") pod "cc25876f-b61b-415a-bf28-b2966c79dc74" (UID: "cc25876f-b61b-415a-bf28-b2966c79dc74"). InnerVolumeSpecName "kube-api-access-65blj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.249740 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-config-data" (OuterVolumeSpecName: "config-data") pod "cc25876f-b61b-415a-bf28-b2966c79dc74" (UID: "cc25876f-b61b-415a-bf28-b2966c79dc74"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.257761 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc25876f-b61b-415a-bf28-b2966c79dc74" (UID: "cc25876f-b61b-415a-bf28-b2966c79dc74"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.334203 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.334232 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65blj\" (UniqueName: \"kubernetes.io/projected/cc25876f-b61b-415a-bf28-b2966c79dc74-kube-api-access-65blj\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.334242 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc25876f-b61b-415a-bf28-b2966c79dc74-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.359319 4803 generic.go:334] "Generic (PLEG): container finished" podID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerID="d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6" exitCode=0 Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.359583 4803 generic.go:334] "Generic (PLEG): container finished" podID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerID="256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9" exitCode=143 Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.360732 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cc25876f-b61b-415a-bf28-b2966c79dc74","Type":"ContainerDied","Data":"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6"} Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.360950 4803 scope.go:117] "RemoveContainer" containerID="d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.361057 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.360924 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cc25876f-b61b-415a-bf28-b2966c79dc74","Type":"ContainerDied","Data":"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9"} Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.361123 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cc25876f-b61b-415a-bf28-b2966c79dc74","Type":"ContainerDied","Data":"5980a2524c4b1554fddab955bb6bbd6d97b7ef2b6b1c36630a0703ccbc9e47c7"} Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.366809 4803 generic.go:334] "Generic (PLEG): container finished" podID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerID="21db21b0f47fef938d7eda7d1ad82c56a83a469f378ee281fae2661ad2d28a7c" exitCode=0 Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.367024 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cz888" event={"ID":"22a4b5ea-5c7b-4b27-bacf-806ef044a297","Type":"ContainerDied","Data":"21db21b0f47fef938d7eda7d1ad82c56a83a469f378ee281fae2661ad2d28a7c"} Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.401174 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.403476 4803 scope.go:117] "RemoveContainer" containerID="256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.416424 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.433340 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:51 crc kubenswrapper[4803]: E0320 17:36:51.433832 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" 
containerName="nova-api-log" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.433856 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerName="nova-api-log" Mar 20 17:36:51 crc kubenswrapper[4803]: E0320 17:36:51.433869 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d2e08fd-0246-49d5-b874-93ba4a2915b6" containerName="nova-manage" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.433878 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d2e08fd-0246-49d5-b874-93ba4a2915b6" containerName="nova-manage" Mar 20 17:36:51 crc kubenswrapper[4803]: E0320 17:36:51.433914 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerName="nova-api-api" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.433924 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerName="nova-api-api" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.434130 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerName="nova-api-api" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.434151 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d2e08fd-0246-49d5-b874-93ba4a2915b6" containerName="nova-manage" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.434161 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" containerName="nova-api-log" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.435072 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.437317 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.445605 4803 scope.go:117] "RemoveContainer" containerID="d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6" Mar 20 17:36:51 crc kubenswrapper[4803]: E0320 17:36:51.446544 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6\": container with ID starting with d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6 not found: ID does not exist" containerID="d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.446572 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6"} err="failed to get container status \"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6\": rpc error: code = NotFound desc = could not find container \"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6\": container with ID starting with d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6 not found: ID does not exist" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.446591 4803 scope.go:117] "RemoveContainer" containerID="256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9" Mar 20 17:36:51 crc kubenswrapper[4803]: E0320 17:36:51.447805 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9\": container with ID starting with 256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9 not found: 
ID does not exist" containerID="256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.447824 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9"} err="failed to get container status \"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9\": rpc error: code = NotFound desc = could not find container \"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9\": container with ID starting with 256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9 not found: ID does not exist" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.447838 4803 scope.go:117] "RemoveContainer" containerID="d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.448052 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6"} err="failed to get container status \"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6\": rpc error: code = NotFound desc = could not find container \"d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6\": container with ID starting with d165229ad185af09d8871e2b5d2dd1112b8ceebdf4710a0d87edfea8e4f18be6 not found: ID does not exist" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.448069 4803 scope.go:117] "RemoveContainer" containerID="256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.448370 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9"} err="failed to get container status \"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9\": rpc error: code = 
NotFound desc = could not find container \"256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9\": container with ID starting with 256930f4192b53bfc678e76982ee04e0b016c4ae38061ea380062f7a2b1804f9 not found: ID does not exist" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.458511 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.471502 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.537678 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-swift-storage-0\") pod \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.537738 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-config\") pod \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.537852 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-sb\") pod \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.537906 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-nb\") pod \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " Mar 20 17:36:51 
crc kubenswrapper[4803]: I0320 17:36:51.538077 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-svc\") pod \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.538130 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdm66\" (UniqueName: \"kubernetes.io/projected/22a4b5ea-5c7b-4b27-bacf-806ef044a297-kube-api-access-pdm66\") pod \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\" (UID: \"22a4b5ea-5c7b-4b27-bacf-806ef044a297\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.538510 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-config-data\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.538606 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.538675 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5bj6\" (UniqueName: \"kubernetes.io/projected/6145c682-fa69-4755-a845-883da359fffd-kube-api-access-p5bj6\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.538769 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/6145c682-fa69-4755-a845-883da359fffd-logs\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.544602 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a4b5ea-5c7b-4b27-bacf-806ef044a297-kube-api-access-pdm66" (OuterVolumeSpecName: "kube-api-access-pdm66") pod "22a4b5ea-5c7b-4b27-bacf-806ef044a297" (UID: "22a4b5ea-5c7b-4b27-bacf-806ef044a297"). InnerVolumeSpecName "kube-api-access-pdm66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.589502 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "22a4b5ea-5c7b-4b27-bacf-806ef044a297" (UID: "22a4b5ea-5c7b-4b27-bacf-806ef044a297"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.594791 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "22a4b5ea-5c7b-4b27-bacf-806ef044a297" (UID: "22a4b5ea-5c7b-4b27-bacf-806ef044a297"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.596489 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22a4b5ea-5c7b-4b27-bacf-806ef044a297" (UID: "22a4b5ea-5c7b-4b27-bacf-806ef044a297"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.603685 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-config" (OuterVolumeSpecName: "config") pod "22a4b5ea-5c7b-4b27-bacf-806ef044a297" (UID: "22a4b5ea-5c7b-4b27-bacf-806ef044a297"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.607238 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "22a4b5ea-5c7b-4b27-bacf-806ef044a297" (UID: "22a4b5ea-5c7b-4b27-bacf-806ef044a297"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640237 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5bj6\" (UniqueName: \"kubernetes.io/projected/6145c682-fa69-4755-a845-883da359fffd-kube-api-access-p5bj6\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640367 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6145c682-fa69-4755-a845-883da359fffd-logs\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640441 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-config-data\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc 
kubenswrapper[4803]: I0320 17:36:51.640502 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640586 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdm66\" (UniqueName: \"kubernetes.io/projected/22a4b5ea-5c7b-4b27-bacf-806ef044a297-kube-api-access-pdm66\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640599 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640626 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640636 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640645 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.640654 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a4b5ea-5c7b-4b27-bacf-806ef044a297-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 
17:36:51.641008 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6145c682-fa69-4755-a845-883da359fffd-logs\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.647160 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.648307 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-config-data\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.659227 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5bj6\" (UniqueName: \"kubernetes.io/projected/6145c682-fa69-4755-a845-883da359fffd-kube-api-access-p5bj6\") pod \"nova-api-0\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") " pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.758343 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.904054 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.945001 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-config-data\") pod \"51db7102-cc0d-407f-8d55-2a96de3f6f63\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.945075 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p8wp\" (UniqueName: \"kubernetes.io/projected/51db7102-cc0d-407f-8d55-2a96de3f6f63-kube-api-access-5p8wp\") pod \"51db7102-cc0d-407f-8d55-2a96de3f6f63\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.945258 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-combined-ca-bundle\") pod \"51db7102-cc0d-407f-8d55-2a96de3f6f63\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.945317 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-scripts\") pod \"51db7102-cc0d-407f-8d55-2a96de3f6f63\" (UID: \"51db7102-cc0d-407f-8d55-2a96de3f6f63\") " Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.949614 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-scripts" (OuterVolumeSpecName: "scripts") pod "51db7102-cc0d-407f-8d55-2a96de3f6f63" (UID: "51db7102-cc0d-407f-8d55-2a96de3f6f63"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.949900 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51db7102-cc0d-407f-8d55-2a96de3f6f63-kube-api-access-5p8wp" (OuterVolumeSpecName: "kube-api-access-5p8wp") pod "51db7102-cc0d-407f-8d55-2a96de3f6f63" (UID: "51db7102-cc0d-407f-8d55-2a96de3f6f63"). InnerVolumeSpecName "kube-api-access-5p8wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.974066 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-config-data" (OuterVolumeSpecName: "config-data") pod "51db7102-cc0d-407f-8d55-2a96de3f6f63" (UID: "51db7102-cc0d-407f-8d55-2a96de3f6f63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:51 crc kubenswrapper[4803]: I0320 17:36:51.987735 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51db7102-cc0d-407f-8d55-2a96de3f6f63" (UID: "51db7102-cc0d-407f-8d55-2a96de3f6f63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.050265 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.050300 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.050313 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p8wp\" (UniqueName: \"kubernetes.io/projected/51db7102-cc0d-407f-8d55-2a96de3f6f63-kube-api-access-5p8wp\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.050325 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51db7102-cc0d-407f-8d55-2a96de3f6f63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.248808 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.249616 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.356419 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpxqt\" (UniqueName: \"kubernetes.io/projected/8e2fb0a0-0a72-45b5-8907-272843a4d54f-kube-api-access-zpxqt\") pod \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.357435 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-config-data\") pod \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.357768 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-combined-ca-bundle\") pod \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\" (UID: \"8e2fb0a0-0a72-45b5-8907-272843a4d54f\") " Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.361598 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2fb0a0-0a72-45b5-8907-272843a4d54f-kube-api-access-zpxqt" (OuterVolumeSpecName: "kube-api-access-zpxqt") pod "8e2fb0a0-0a72-45b5-8907-272843a4d54f" (UID: "8e2fb0a0-0a72-45b5-8907-272843a4d54f"). InnerVolumeSpecName "kube-api-access-zpxqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.395198 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6145c682-fa69-4755-a845-883da359fffd","Type":"ContainerStarted","Data":"5b47ab503812b1aecc25785b49198d6b8fab1ceeec621d239cc25520b3ab4e34"} Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.396603 4803 generic.go:334] "Generic (PLEG): container finished" podID="8e2fb0a0-0a72-45b5-8907-272843a4d54f" containerID="d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda" exitCode=0 Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.396662 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8e2fb0a0-0a72-45b5-8907-272843a4d54f","Type":"ContainerDied","Data":"d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda"} Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.396678 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8e2fb0a0-0a72-45b5-8907-272843a4d54f","Type":"ContainerDied","Data":"3c011727a5231d7468d2ce22490ff7aaf058a402a4b2829a2b6d0c3d68cc4b70"} Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.396687 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.396695 4803 scope.go:117] "RemoveContainer" containerID="d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.400889 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-cz888" event={"ID":"22a4b5ea-5c7b-4b27-bacf-806ef044a297","Type":"ContainerDied","Data":"6165b5838dd6b0e6aac8645d3c0d43246de91f9f46c8d765aac5c8775147bc49"} Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.400949 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-cz888" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.425102 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e2fb0a0-0a72-45b5-8907-272843a4d54f" (UID: "8e2fb0a0-0a72-45b5-8907-272843a4d54f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.427046 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" event={"ID":"51db7102-cc0d-407f-8d55-2a96de3f6f63","Type":"ContainerDied","Data":"9d32d342ed0c2380b40268286a05b8c65b23afa1403f48056a28bcf285db66b6"} Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.427091 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d32d342ed0c2380b40268286a05b8c65b23afa1403f48056a28bcf285db66b6" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.427156 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-mkxj8" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.437025 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:36:52 crc kubenswrapper[4803]: E0320 17:36:52.438005 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51db7102-cc0d-407f-8d55-2a96de3f6f63" containerName="nova-cell1-conductor-db-sync" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438025 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="51db7102-cc0d-407f-8d55-2a96de3f6f63" containerName="nova-cell1-conductor-db-sync" Mar 20 17:36:52 crc kubenswrapper[4803]: E0320 17:36:52.438042 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerName="init" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438049 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerName="init" Mar 20 17:36:52 crc kubenswrapper[4803]: E0320 17:36:52.438070 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerName="dnsmasq-dns" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438077 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerName="dnsmasq-dns" Mar 20 17:36:52 crc kubenswrapper[4803]: E0320 17:36:52.438089 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2fb0a0-0a72-45b5-8907-272843a4d54f" containerName="nova-scheduler-scheduler" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438094 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2fb0a0-0a72-45b5-8907-272843a4d54f" containerName="nova-scheduler-scheduler" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438261 4803 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8e2fb0a0-0a72-45b5-8907-272843a4d54f" containerName="nova-scheduler-scheduler" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438272 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" containerName="dnsmasq-dns" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438293 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="51db7102-cc0d-407f-8d55-2a96de3f6f63" containerName="nova-cell1-conductor-db-sync" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.438888 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.443166 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.460368 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvlt\" (UniqueName: \"kubernetes.io/projected/76a50897-11fb-467d-8b45-dde2ad95b1fd-kube-api-access-swvlt\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.460494 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a50897-11fb-467d-8b45-dde2ad95b1fd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.460558 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a50897-11fb-467d-8b45-dde2ad95b1fd-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.460602 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpxqt\" (UniqueName: \"kubernetes.io/projected/8e2fb0a0-0a72-45b5-8907-272843a4d54f-kube-api-access-zpxqt\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.460612 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.463781 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-config-data" (OuterVolumeSpecName: "config-data") pod "8e2fb0a0-0a72-45b5-8907-272843a4d54f" (UID: "8e2fb0a0-0a72-45b5-8907-272843a4d54f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.465622 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.466710 4803 scope.go:117] "RemoveContainer" containerID="d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda" Mar 20 17:36:52 crc kubenswrapper[4803]: E0320 17:36:52.467087 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda\": container with ID starting with d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda not found: ID does not exist" containerID="d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.467130 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda"} err="failed to get container status \"d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda\": rpc error: code = NotFound desc = could not find container \"d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda\": container with ID starting with d5ab4e79de43c64b42c6ded890fb3b5d3c3ea3aa7c6dfe377ac17c209e64efda not found: ID does not exist" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.467149 4803 scope.go:117] "RemoveContainer" containerID="21db21b0f47fef938d7eda7d1ad82c56a83a469f378ee281fae2661ad2d28a7c" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.492926 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cz888"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.494165 4803 scope.go:117] "RemoveContainer" containerID="dcf1a30c9b0689299043ca01decfe0c25dc5ad1fe50242d439bd849ec0f2446f" Mar 20 17:36:52 crc 
kubenswrapper[4803]: I0320 17:36:52.501371 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-cz888"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.561732 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a50897-11fb-467d-8b45-dde2ad95b1fd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.561803 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a50897-11fb-467d-8b45-dde2ad95b1fd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.561845 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swvlt\" (UniqueName: \"kubernetes.io/projected/76a50897-11fb-467d-8b45-dde2ad95b1fd-kube-api-access-swvlt\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.561947 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e2fb0a0-0a72-45b5-8907-272843a4d54f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.565498 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76a50897-11fb-467d-8b45-dde2ad95b1fd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.567217 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76a50897-11fb-467d-8b45-dde2ad95b1fd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.575990 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swvlt\" (UniqueName: \"kubernetes.io/projected/76a50897-11fb-467d-8b45-dde2ad95b1fd-kube-api-access-swvlt\") pod \"nova-cell1-conductor-0\" (UID: \"76a50897-11fb-467d-8b45-dde2ad95b1fd\") " pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.748628 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.763218 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.768740 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.774489 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.775646 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.778072 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.787076 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.859082 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a4b5ea-5c7b-4b27-bacf-806ef044a297" path="/var/lib/kubelet/pods/22a4b5ea-5c7b-4b27-bacf-806ef044a297/volumes" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.859905 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2fb0a0-0a72-45b5-8907-272843a4d54f" path="/var/lib/kubelet/pods/8e2fb0a0-0a72-45b5-8907-272843a4d54f/volumes" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.860655 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc25876f-b61b-415a-bf28-b2966c79dc74" path="/var/lib/kubelet/pods/cc25876f-b61b-415a-bf28-b2966c79dc74/volumes" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.867231 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv27g\" (UniqueName: \"kubernetes.io/projected/12e443ee-a472-451b-8243-38a405aacc73-kube-api-access-rv27g\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.873800 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-config-data\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.874239 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.976468 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.977298 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv27g\" (UniqueName: \"kubernetes.io/projected/12e443ee-a472-451b-8243-38a405aacc73-kube-api-access-rv27g\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.977333 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-config-data\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.984414 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-config-data\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:52 crc kubenswrapper[4803]: I0320 17:36:52.999389 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.008823 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv27g\" (UniqueName: \"kubernetes.io/projected/12e443ee-a472-451b-8243-38a405aacc73-kube-api-access-rv27g\") pod \"nova-scheduler-0\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") " pod="openstack/nova-scheduler-0" Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.115334 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 17:36:53 crc kubenswrapper[4803]: W0320 17:36:53.288318 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76a50897_11fb_467d_8b45_dde2ad95b1fd.slice/crio-9369d4a5e446d64aa3d8a95828f2fd72815e5447a59a742e9b69e7b5484f9acd WatchSource:0}: Error finding container 9369d4a5e446d64aa3d8a95828f2fd72815e5447a59a742e9b69e7b5484f9acd: Status 404 returned error can't find the container with id 9369d4a5e446d64aa3d8a95828f2fd72815e5447a59a742e9b69e7b5484f9acd Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.289246 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.439982 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"76a50897-11fb-467d-8b45-dde2ad95b1fd","Type":"ContainerStarted","Data":"9369d4a5e446d64aa3d8a95828f2fd72815e5447a59a742e9b69e7b5484f9acd"} Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.442789 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"6145c682-fa69-4755-a845-883da359fffd","Type":"ContainerStarted","Data":"47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2"} Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.442817 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6145c682-fa69-4755-a845-883da359fffd","Type":"ContainerStarted","Data":"608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b"} Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.466774 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.466755163 podStartE2EDuration="2.466755163s" podCreationTimestamp="2026-03-20 17:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:53.464251985 +0000 UTC m=+1223.375844095" watchObservedRunningTime="2026-03-20 17:36:53.466755163 +0000 UTC m=+1223.378347243" Mar 20 17:36:53 crc kubenswrapper[4803]: W0320 17:36:53.580968 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12e443ee_a472_451b_8243_38a405aacc73.slice/crio-81316e82965aacf419aa44b13a00ab66e688350e2ee72ecc513f5b7f88e31d52 WatchSource:0}: Error finding container 81316e82965aacf419aa44b13a00ab66e688350e2ee72ecc513f5b7f88e31d52: Status 404 returned error can't find the container with id 81316e82965aacf419aa44b13a00ab66e688350e2ee72ecc513f5b7f88e31d52 Mar 20 17:36:53 crc kubenswrapper[4803]: I0320 17:36:53.586966 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:36:54 crc kubenswrapper[4803]: I0320 17:36:54.462514 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"76a50897-11fb-467d-8b45-dde2ad95b1fd","Type":"ContainerStarted","Data":"935075d8796a95e714a8341d515f04ab9e51fd80b08c321105e4bda5f777e019"} Mar 20 17:36:54 crc kubenswrapper[4803]: I0320 17:36:54.462986 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 17:36:54 crc kubenswrapper[4803]: I0320 17:36:54.465976 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12e443ee-a472-451b-8243-38a405aacc73","Type":"ContainerStarted","Data":"5cb960c5d486708e2feea8bbaca388c49dfbc83ed666e179c283fcd3badbf2ae"} Mar 20 17:36:54 crc kubenswrapper[4803]: I0320 17:36:54.466049 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12e443ee-a472-451b-8243-38a405aacc73","Type":"ContainerStarted","Data":"81316e82965aacf419aa44b13a00ab66e688350e2ee72ecc513f5b7f88e31d52"} Mar 20 17:36:54 crc kubenswrapper[4803]: I0320 17:36:54.484460 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.484439038 podStartE2EDuration="2.484439038s" podCreationTimestamp="2026-03-20 17:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:54.477449 +0000 UTC m=+1224.389041110" watchObservedRunningTime="2026-03-20 17:36:54.484439038 +0000 UTC m=+1224.396031108" Mar 20 17:36:54 crc kubenswrapper[4803]: I0320 17:36:54.503239 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.503219874 podStartE2EDuration="2.503219874s" podCreationTimestamp="2026-03-20 17:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:36:54.501767595 +0000 UTC m=+1224.413359675" watchObservedRunningTime="2026-03-20 
17:36:54.503219874 +0000 UTC m=+1224.414811954" Mar 20 17:36:58 crc kubenswrapper[4803]: I0320 17:36:58.115489 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:36:58 crc kubenswrapper[4803]: I0320 17:36:58.872041 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:36:58 crc kubenswrapper[4803]: I0320 17:36:58.872379 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:36:59 crc kubenswrapper[4803]: I0320 17:36:59.800210 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:37:01 crc kubenswrapper[4803]: I0320 17:37:01.759551 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:37:01 crc kubenswrapper[4803]: I0320 17:37:01.759874 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:37:02 crc kubenswrapper[4803]: I0320 17:37:02.799584 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 17:37:02 crc kubenswrapper[4803]: I0320 17:37:02.800780 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:02 crc kubenswrapper[4803]: I0320 17:37:02.842912 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:03 crc kubenswrapper[4803]: I0320 17:37:03.116106 
4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:37:03 crc kubenswrapper[4803]: I0320 17:37:03.140516 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:37:03 crc kubenswrapper[4803]: I0320 17:37:03.449487 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:37:03 crc kubenswrapper[4803]: I0320 17:37:03.449964 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3df34a16-c42d-48fa-90d4-711071acb1a8" containerName="kube-state-metrics" containerID="cri-o://be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d" gracePeriod=30 Mar 20 17:37:03 crc kubenswrapper[4803]: I0320 17:37:03.598616 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:37:03 crc kubenswrapper[4803]: I0320 17:37:03.920166 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.020730 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcjrb\" (UniqueName: \"kubernetes.io/projected/3df34a16-c42d-48fa-90d4-711071acb1a8-kube-api-access-lcjrb\") pod \"3df34a16-c42d-48fa-90d4-711071acb1a8\" (UID: \"3df34a16-c42d-48fa-90d4-711071acb1a8\") " Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.046127 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df34a16-c42d-48fa-90d4-711071acb1a8-kube-api-access-lcjrb" (OuterVolumeSpecName: "kube-api-access-lcjrb") pod "3df34a16-c42d-48fa-90d4-711071acb1a8" (UID: "3df34a16-c42d-48fa-90d4-711071acb1a8"). InnerVolumeSpecName "kube-api-access-lcjrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.122927 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcjrb\" (UniqueName: \"kubernetes.io/projected/3df34a16-c42d-48fa-90d4-711071acb1a8-kube-api-access-lcjrb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.566434 4803 generic.go:334] "Generic (PLEG): container finished" podID="3df34a16-c42d-48fa-90d4-711071acb1a8" containerID="be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d" exitCode=2 Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.566480 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3df34a16-c42d-48fa-90d4-711071acb1a8","Type":"ContainerDied","Data":"be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d"} Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.566753 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3df34a16-c42d-48fa-90d4-711071acb1a8","Type":"ContainerDied","Data":"ad10c7c858414480d4cf90af3d454ebc2fe2d6af5404defe34bf65b8a781ac5b"} Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.566788 4803 scope.go:117] "RemoveContainer" containerID="be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.566555 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.590934 4803 scope.go:117] "RemoveContainer" containerID="be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d" Mar 20 17:37:04 crc kubenswrapper[4803]: E0320 17:37:04.591506 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d\": container with ID starting with be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d not found: ID does not exist" containerID="be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.591583 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d"} err="failed to get container status \"be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d\": rpc error: code = NotFound desc = could not find container \"be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d\": container with ID starting with be97315161b2032ce71a55f2a223a517c34baace7b5ea48b3489acb374c7933d not found: ID does not exist" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.601266 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.610170 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.621200 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:37:04 crc kubenswrapper[4803]: E0320 17:37:04.621760 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df34a16-c42d-48fa-90d4-711071acb1a8" containerName="kube-state-metrics" Mar 20 17:37:04 crc 
kubenswrapper[4803]: I0320 17:37:04.621826 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df34a16-c42d-48fa-90d4-711071acb1a8" containerName="kube-state-metrics" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.622073 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df34a16-c42d-48fa-90d4-711071acb1a8" containerName="kube-state-metrics" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.622734 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.625220 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.625899 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.641770 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.734110 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.734318 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zn4\" (UniqueName: \"kubernetes.io/projected/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-api-access-b6zn4\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.734544 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.734655 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.836947 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.837267 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.837395 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.837510 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b6zn4\" (UniqueName: \"kubernetes.io/projected/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-api-access-b6zn4\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.842074 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.842090 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.843237 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e13d6de9-6ef6-4194-98da-c8fee814f4d1-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.859869 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df34a16-c42d-48fa-90d4-711071acb1a8" path="/var/lib/kubelet/pods/3df34a16-c42d-48fa-90d4-711071acb1a8/volumes" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.874639 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zn4\" (UniqueName: \"kubernetes.io/projected/e13d6de9-6ef6-4194-98da-c8fee814f4d1-kube-api-access-b6zn4\") pod \"kube-state-metrics-0\" (UID: 
\"e13d6de9-6ef6-4194-98da-c8fee814f4d1\") " pod="openstack/kube-state-metrics-0" Mar 20 17:37:04 crc kubenswrapper[4803]: I0320 17:37:04.945153 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 17:37:05 crc kubenswrapper[4803]: I0320 17:37:05.443927 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 17:37:05 crc kubenswrapper[4803]: I0320 17:37:05.490607 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:05 crc kubenswrapper[4803]: I0320 17:37:05.490952 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-central-agent" containerID="cri-o://a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913" gracePeriod=30 Mar 20 17:37:05 crc kubenswrapper[4803]: I0320 17:37:05.491024 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-notification-agent" containerID="cri-o://3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417" gracePeriod=30 Mar 20 17:37:05 crc kubenswrapper[4803]: I0320 17:37:05.491029 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="proxy-httpd" containerID="cri-o://fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf" gracePeriod=30 Mar 20 17:37:05 crc kubenswrapper[4803]: I0320 17:37:05.491060 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="sg-core" containerID="cri-o://e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9" gracePeriod=30 Mar 20 17:37:05 crc 
kubenswrapper[4803]: I0320 17:37:05.577096 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e13d6de9-6ef6-4194-98da-c8fee814f4d1","Type":"ContainerStarted","Data":"c29dabc791a77a91d70a0ffc793cb6539868e6530ae103fbfbb3212b3f9ee932"} Mar 20 17:37:06 crc kubenswrapper[4803]: I0320 17:37:06.591113 4803 generic.go:334] "Generic (PLEG): container finished" podID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerID="fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf" exitCode=0 Mar 20 17:37:06 crc kubenswrapper[4803]: I0320 17:37:06.591450 4803 generic.go:334] "Generic (PLEG): container finished" podID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerID="e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9" exitCode=2 Mar 20 17:37:06 crc kubenswrapper[4803]: I0320 17:37:06.591462 4803 generic.go:334] "Generic (PLEG): container finished" podID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerID="a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913" exitCode=0 Mar 20 17:37:06 crc kubenswrapper[4803]: I0320 17:37:06.591203 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerDied","Data":"fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf"} Mar 20 17:37:06 crc kubenswrapper[4803]: I0320 17:37:06.591508 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerDied","Data":"e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9"} Mar 20 17:37:06 crc kubenswrapper[4803]: I0320 17:37:06.591560 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerDied","Data":"a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913"} Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 
17:37:08.290981 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.409018 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-run-httpd\") pod \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.409084 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-config-data\") pod \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.409169 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-combined-ca-bundle\") pod \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.409199 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-sg-core-conf-yaml\") pod \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.409271 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkqh5\" (UniqueName: \"kubernetes.io/projected/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-kube-api-access-bkqh5\") pod \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.409924 4803 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" (UID: "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.409373 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-scripts\") pod \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.410286 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-log-httpd\") pod \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\" (UID: \"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3\") " Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.410934 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" (UID: "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.411578 4803 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.411620 4803 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.413443 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-scripts" (OuterVolumeSpecName: "scripts") pod "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" (UID: "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.413744 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-kube-api-access-bkqh5" (OuterVolumeSpecName: "kube-api-access-bkqh5") pod "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" (UID: "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3"). InnerVolumeSpecName "kube-api-access-bkqh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.461793 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" (UID: "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.508675 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" (UID: "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.513234 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.513260 4803 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.513272 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkqh5\" (UniqueName: \"kubernetes.io/projected/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-kube-api-access-bkqh5\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.513287 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.547640 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-config-data" (OuterVolumeSpecName: "config-data") pod "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" (UID: "d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.614820 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.621951 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e13d6de9-6ef6-4194-98da-c8fee814f4d1","Type":"ContainerStarted","Data":"1f36b4555535894385345ad59c35091f562c13ee5f522ca52a20f456d68ae189"} Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.622098 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.626160 4803 generic.go:334] "Generic (PLEG): container finished" podID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerID="3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417" exitCode=0 Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.626233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerDied","Data":"3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417"} Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.626249 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.626286 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3","Type":"ContainerDied","Data":"4683005a11243fb852a0c778a29aa6b50ec65ceae2336229a6179cc6eb0149e8"} Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.626311 4803 scope.go:117] "RemoveContainer" containerID="fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.655810 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.517297579 podStartE2EDuration="4.65578616s" podCreationTimestamp="2026-03-20 17:37:04 +0000 UTC" firstStartedPulling="2026-03-20 17:37:05.451731254 +0000 UTC m=+1235.363323354" lastFinishedPulling="2026-03-20 17:37:07.590219855 +0000 UTC m=+1237.501811935" observedRunningTime="2026-03-20 17:37:08.648019931 +0000 UTC m=+1238.559612061" watchObservedRunningTime="2026-03-20 17:37:08.65578616 +0000 UTC m=+1238.567378230" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.703475 4803 scope.go:117] "RemoveContainer" containerID="e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.713246 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.723684 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.740446 4803 scope.go:117] "RemoveContainer" containerID="3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.752500 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 
17:37:08.753054 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-central-agent" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.753075 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-central-agent" Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 17:37:08.753089 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="sg-core" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.753098 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="sg-core" Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 17:37:08.753118 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="proxy-httpd" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.753126 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="proxy-httpd" Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 17:37:08.753147 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-notification-agent" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.753156 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-notification-agent" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.754100 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="sg-core" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.754135 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="proxy-httpd" Mar 20 17:37:08 crc kubenswrapper[4803]: 
I0320 17:37:08.754164 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-central-agent" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.754175 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" containerName="ceilometer-notification-agent" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.756392 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.762340 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.762543 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.762764 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.762921 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.774171 4803 scope.go:117] "RemoveContainer" containerID="a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.803864 4803 scope.go:117] "RemoveContainer" containerID="fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf" Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 17:37:08.804237 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf\": container with ID starting with fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf not found: ID does not exist" 
containerID="fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.804271 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf"} err="failed to get container status \"fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf\": rpc error: code = NotFound desc = could not find container \"fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf\": container with ID starting with fcf48771ac12f2bc5691f9c8a6e20e3979a77c561981b8348db891fef8477bbf not found: ID does not exist" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.804297 4803 scope.go:117] "RemoveContainer" containerID="e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9" Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 17:37:08.804478 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9\": container with ID starting with e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9 not found: ID does not exist" containerID="e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.804505 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9"} err="failed to get container status \"e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9\": rpc error: code = NotFound desc = could not find container \"e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9\": container with ID starting with e99cac24d9057c99a6396b1c2e8728181c864c31f2f2fd9848880417e2df60e9 not found: ID does not exist" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.804542 4803 scope.go:117] 
"RemoveContainer" containerID="3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417" Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 17:37:08.804752 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417\": container with ID starting with 3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417 not found: ID does not exist" containerID="3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.804777 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417"} err="failed to get container status \"3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417\": rpc error: code = NotFound desc = could not find container \"3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417\": container with ID starting with 3249cbcf15fb827e71ef10c99e6fbd758e9ca44343c71a79d0aa8b791d56d417 not found: ID does not exist" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.804795 4803 scope.go:117] "RemoveContainer" containerID="a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913" Mar 20 17:37:08 crc kubenswrapper[4803]: E0320 17:37:08.805959 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913\": container with ID starting with a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913 not found: ID does not exist" containerID="a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.806009 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913"} err="failed to get container status \"a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913\": rpc error: code = NotFound desc = could not find container \"a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913\": container with ID starting with a1ad836e729b1dd3fb08e193952c6187d7b3210e872c86ea8d74c33665707913 not found: ID does not exist" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.819981 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.820117 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-log-httpd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.820208 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.820307 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-run-httpd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: 
I0320 17:37:08.820489 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.820596 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-config-data\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.820720 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-scripts\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.820784 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djggd\" (UniqueName: \"kubernetes.io/projected/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-kube-api-access-djggd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.858961 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3" path="/var/lib/kubelet/pods/d8e7c73c-3a6d-47f3-9d3d-4096eb3477c3/volumes" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.922140 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.922183 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-config-data\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.922235 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-scripts\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.922260 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djggd\" (UniqueName: \"kubernetes.io/projected/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-kube-api-access-djggd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.922310 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.923153 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-log-httpd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0" Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.923177 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.923227 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-run-httpd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.923904 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-log-httpd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.923931 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-run-httpd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.926476 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.936927 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.937091 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.937461 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-scripts\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.939162 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djggd\" (UniqueName: \"kubernetes.io/projected/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-kube-api-access-djggd\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:08 crc kubenswrapper[4803]: I0320 17:37:08.940398 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-config-data\") pod \"ceilometer-0\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " pod="openstack/ceilometer-0"
Mar 20 17:37:09 crc kubenswrapper[4803]: I0320 17:37:09.108620 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:37:09 crc kubenswrapper[4803]: I0320 17:37:09.600821 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:09 crc kubenswrapper[4803]: I0320 17:37:09.635773 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerStarted","Data":"f093275566d728b23d92bf0e241625c05c6d745ef127aa1b2c5acad5f2e9c59f"}
Mar 20 17:37:09 crc kubenswrapper[4803]: I0320 17:37:09.759107 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 17:37:09 crc kubenswrapper[4803]: I0320 17:37:09.759159 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 17:37:11 crc kubenswrapper[4803]: I0320 17:37:11.656798 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerStarted","Data":"afef0bc2eab04e4663cff0472df588bd41cf087eacf339adfc6a0aeaed353352"}
Mar 20 17:37:11 crc kubenswrapper[4803]: I0320 17:37:11.763817 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 17:37:11 crc kubenswrapper[4803]: I0320 17:37:11.766386 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 17:37:11 crc kubenswrapper[4803]: I0320 17:37:11.769110 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 17:37:12 crc kubenswrapper[4803]: I0320 17:37:12.674430 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 17:37:12 crc kubenswrapper[4803]: I0320 17:37:12.928822 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"]
Mar 20 17:37:12 crc kubenswrapper[4803]: I0320 17:37:12.948787 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"]
Mar 20 17:37:12 crc kubenswrapper[4803]: I0320 17:37:12.948910 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.013905 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-config\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.014002 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.014029 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psbfw\" (UniqueName: \"kubernetes.io/projected/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-kube-api-access-psbfw\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.014101 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.014119 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.014149 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.115928 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.116268 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psbfw\" (UniqueName: \"kubernetes.io/projected/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-kube-api-access-psbfw\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.116330 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.116353 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.116381 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.116440 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-config\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.116989 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.117196 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-config\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.117288 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.117566 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.117798 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.140664 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psbfw\" (UniqueName: \"kubernetes.io/projected/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-kube-api-access-psbfw\") pod \"dnsmasq-dns-cd5cbd7b9-mxk8r\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.267032 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.682826 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerStarted","Data":"29c149d33bdc0c586f07218cad9624861034639f49e9c544f0106e67d704c690"}
Mar 20 17:37:13 crc kubenswrapper[4803]: I0320 17:37:13.794780 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"]
Mar 20 17:37:13 crc kubenswrapper[4803]: W0320 17:37:13.803558 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a290344_3b36_4e1c_a94e_f9c31b7b8fe0.slice/crio-c1393bac5be2b0f622a4ba716ffc58c78a440dbd0bdac02e80e5f7f446566d36 WatchSource:0}: Error finding container c1393bac5be2b0f622a4ba716ffc58c78a440dbd0bdac02e80e5f7f446566d36: Status 404 returned error can't find the container with id c1393bac5be2b0f622a4ba716ffc58c78a440dbd0bdac02e80e5f7f446566d36
Mar 20 17:37:14 crc kubenswrapper[4803]: I0320 17:37:14.693159 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerStarted","Data":"ef46562dba43743d6d3b5a87301072fb0c4a928647f7f71bc258530d740ea84d"}
Mar 20 17:37:14 crc kubenswrapper[4803]: I0320 17:37:14.695044 4803 generic.go:334] "Generic (PLEG): container finished" podID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerID="2c20443cf723ee8f0def494ccc438fd0d53ae77c16b6fc07da8299222a2245b5" exitCode=0
Mar 20 17:37:14 crc kubenswrapper[4803]: I0320 17:37:14.695129 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" event={"ID":"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0","Type":"ContainerDied","Data":"2c20443cf723ee8f0def494ccc438fd0d53ae77c16b6fc07da8299222a2245b5"}
Mar 20 17:37:14 crc kubenswrapper[4803]: I0320 17:37:14.695161 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" event={"ID":"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0","Type":"ContainerStarted","Data":"c1393bac5be2b0f622a4ba716ffc58c78a440dbd0bdac02e80e5f7f446566d36"}
Mar 20 17:37:14 crc kubenswrapper[4803]: I0320 17:37:14.969690 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.119004 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.223147 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.667293 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.707935 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" event={"ID":"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0","Type":"ContainerStarted","Data":"246a4b0f715b2b9961f880a5b5939f4b4026cb26304f6dc56601a263a444b0a3"}
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.707992 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.710717 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12ba7dc7-8933-457c-ae40-4314d7ef54b5","Type":"ContainerDied","Data":"2339521c4ff2f7943a38f4a14fd5670e3f89c1ceea7b7fcac050e7ea5ea97ac1"}
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.710622 4803 generic.go:334] "Generic (PLEG): container finished" podID="12ba7dc7-8933-457c-ae40-4314d7ef54b5" containerID="2339521c4ff2f7943a38f4a14fd5670e3f89c1ceea7b7fcac050e7ea5ea97ac1" exitCode=137
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.714829 4803 generic.go:334] "Generic (PLEG): container finished" podID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerID="f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6" exitCode=137
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.714991 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-log" containerID="cri-o://608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b" gracePeriod=30
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.715653 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-api" containerID="cri-o://47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2" gracePeriod=30
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.715759 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5a4d899-9cb9-455b-8ca3-ac66c9f30964","Type":"ContainerDied","Data":"f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6"}
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.715786 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5a4d899-9cb9-455b-8ca3-ac66c9f30964","Type":"ContainerDied","Data":"1c4c5a3d664bb98d7fba7ec45456dc7762e62a30205cf2b263c784b0b048ae84"}
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.716088 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.716111 4803 scope.go:117] "RemoveContainer" containerID="f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.731979 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" podStartSLOduration=3.731960531 podStartE2EDuration="3.731960531s" podCreationTimestamp="2026-03-20 17:37:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:15.728775065 +0000 UTC m=+1245.640367145" watchObservedRunningTime="2026-03-20 17:37:15.731960531 +0000 UTC m=+1245.643552601"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.768852 4803 scope.go:117] "RemoveContainer" containerID="83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.769056 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhlhc\" (UniqueName: \"kubernetes.io/projected/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-kube-api-access-vhlhc\") pod \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") "
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.769186 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-combined-ca-bundle\") pod \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") "
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.769324 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-config-data\") pod \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") "
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.769444 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-logs\") pod \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\" (UID: \"e5a4d899-9cb9-455b-8ca3-ac66c9f30964\") "
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.770191 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-logs" (OuterVolumeSpecName: "logs") pod "e5a4d899-9cb9-455b-8ca3-ac66c9f30964" (UID: "e5a4d899-9cb9-455b-8ca3-ac66c9f30964"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.774975 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-kube-api-access-vhlhc" (OuterVolumeSpecName: "kube-api-access-vhlhc") pod "e5a4d899-9cb9-455b-8ca3-ac66c9f30964" (UID: "e5a4d899-9cb9-455b-8ca3-ac66c9f30964"). InnerVolumeSpecName "kube-api-access-vhlhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.810751 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-config-data" (OuterVolumeSpecName: "config-data") pod "e5a4d899-9cb9-455b-8ca3-ac66c9f30964" (UID: "e5a4d899-9cb9-455b-8ca3-ac66c9f30964"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.818659 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5a4d899-9cb9-455b-8ca3-ac66c9f30964" (UID: "e5a4d899-9cb9-455b-8ca3-ac66c9f30964"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.872296 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.872337 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhlhc\" (UniqueName: \"kubernetes.io/projected/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-kube-api-access-vhlhc\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.872355 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.872368 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5a4d899-9cb9-455b-8ca3-ac66c9f30964-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.903600 4803 scope.go:117] "RemoveContainer" containerID="f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6"
Mar 20 17:37:15 crc kubenswrapper[4803]: E0320 17:37:15.903962 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6\": container with ID starting with f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6 not found: ID does not exist" containerID="f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.904002 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6"} err="failed to get container status \"f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6\": rpc error: code = NotFound desc = could not find container \"f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6\": container with ID starting with f62c32cada12565a2eaadab273e8b183dc8415139066ae91aa9cd5c0090873e6 not found: ID does not exist"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.904027 4803 scope.go:117] "RemoveContainer" containerID="83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc"
Mar 20 17:37:15 crc kubenswrapper[4803]: E0320 17:37:15.904288 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc\": container with ID starting with 83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc not found: ID does not exist" containerID="83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.904333 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.904328 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc"} err="failed to get container status \"83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc\": rpc error: code = NotFound desc = could not find container \"83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc\": container with ID starting with 83093ad64fe949c7f7e6498bbef8ee3d152064831d1684877dc51cd9ac408dfc not found: ID does not exist"
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.973430 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-combined-ca-bundle\") pod \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") "
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.973639 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-config-data\") pod \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") "
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.973719 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x8b9\" (UniqueName: \"kubernetes.io/projected/12ba7dc7-8933-457c-ae40-4314d7ef54b5-kube-api-access-6x8b9\") pod \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\" (UID: \"12ba7dc7-8933-457c-ae40-4314d7ef54b5\") "
Mar 20 17:37:15 crc kubenswrapper[4803]: I0320 17:37:15.980126 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ba7dc7-8933-457c-ae40-4314d7ef54b5-kube-api-access-6x8b9" (OuterVolumeSpecName: "kube-api-access-6x8b9") pod "12ba7dc7-8933-457c-ae40-4314d7ef54b5" (UID: "12ba7dc7-8933-457c-ae40-4314d7ef54b5"). InnerVolumeSpecName "kube-api-access-6x8b9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.002433 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ba7dc7-8933-457c-ae40-4314d7ef54b5" (UID: "12ba7dc7-8933-457c-ae40-4314d7ef54b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.002790 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-config-data" (OuterVolumeSpecName: "config-data") pod "12ba7dc7-8933-457c-ae40-4314d7ef54b5" (UID: "12ba7dc7-8933-457c-ae40-4314d7ef54b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.084739 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x8b9\" (UniqueName: \"kubernetes.io/projected/12ba7dc7-8933-457c-ae40-4314d7ef54b5-kube-api-access-6x8b9\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.087658 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.087802 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ba7dc7-8933-457c-ae40-4314d7ef54b5-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.094873 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.108199 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.121321 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:16 crc kubenswrapper[4803]: E0320 17:37:16.122186 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ba7dc7-8933-457c-ae40-4314d7ef54b5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.122209 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ba7dc7-8933-457c-ae40-4314d7ef54b5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 17:37:16 crc kubenswrapper[4803]: E0320 17:37:16.122217 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-metadata"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.122223 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-metadata"
Mar 20 17:37:16 crc kubenswrapper[4803]: E0320 17:37:16.122236 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-log"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.122243 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-log"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.122441 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-metadata"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.122455 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" containerName="nova-metadata-log"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.122463 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ba7dc7-8933-457c-ae40-4314d7ef54b5" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.123547 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.125309 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.125645 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.137050 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.189794 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2qp\" (UniqueName: \"kubernetes.io/projected/b9dc8572-a2f9-4940-a959-9f90656358b2-kube-api-access-mq2qp\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.189857 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8572-a2f9-4940-a959-9f90656358b2-logs\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.190106 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-config-data\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.190185 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.190380 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.291990 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.292088 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.292119 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2qp\" (UniqueName: \"kubernetes.io/projected/b9dc8572-a2f9-4940-a959-9f90656358b2-kube-api-access-mq2qp\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.292148 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8572-a2f9-4940-a959-9f90656358b2-logs\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.292218 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-config-data\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.293700 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8572-a2f9-4940-a959-9f90656358b2-logs\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.296867 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-config-data\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.297226 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.297866 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.308963 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2qp\" (UniqueName: \"kubernetes.io/projected/b9dc8572-a2f9-4940-a959-9f90656358b2-kube-api-access-mq2qp\") pod \"nova-metadata-0\" (UID:
\"b9dc8572-a2f9-4940-a959-9f90656358b2\") " pod="openstack/nova-metadata-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.444321 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.728331 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"12ba7dc7-8933-457c-ae40-4314d7ef54b5","Type":"ContainerDied","Data":"453e92c0ea7d78ac7f190c022c895c5e8cf10a9794b3c1c5924bc74b90ddeff6"} Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.728693 4803 scope.go:117] "RemoveContainer" containerID="2339521c4ff2f7943a38f4a14fd5670e3f89c1ceea7b7fcac050e7ea5ea97ac1" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.728819 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.734834 4803 generic.go:334] "Generic (PLEG): container finished" podID="6145c682-fa69-4755-a845-883da359fffd" containerID="608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b" exitCode=143 Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.735750 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6145c682-fa69-4755-a845-883da359fffd","Type":"ContainerDied","Data":"608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b"} Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.792481 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.804737 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.826118 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 
17:37:16.827270 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.830535 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.830589 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.830661 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.842028 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.860194 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ba7dc7-8933-457c-ae40-4314d7ef54b5" path="/var/lib/kubelet/pods/12ba7dc7-8933-457c-ae40-4314d7ef54b5/volumes" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.860775 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a4d899-9cb9-455b-8ca3-ac66c9f30964" path="/var/lib/kubelet/pods/e5a4d899-9cb9-455b-8ca3-ac66c9f30964/volumes" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.905018 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdh5w\" (UniqueName: \"kubernetes.io/projected/62cc6e29-7f2d-4c3d-b32c-4b906520eded-kube-api-access-bdh5w\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.905066 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.905159 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.905227 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:16 crc kubenswrapper[4803]: I0320 17:37:16.905301 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: W0320 17:37:17.002073 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9dc8572_a2f9_4940_a959_9f90656358b2.slice/crio-3727f1e6cead65ba3fb503760532a21c1e9d36f565712a32009b8c77ebdc0c56 WatchSource:0}: Error finding container 3727f1e6cead65ba3fb503760532a21c1e9d36f565712a32009b8c77ebdc0c56: Status 404 returned error can't find the container with id 3727f1e6cead65ba3fb503760532a21c1e9d36f565712a32009b8c77ebdc0c56 Mar 20 17:37:17 crc kubenswrapper[4803]: 
I0320 17:37:17.007080 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.007159 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.007206 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdh5w\" (UniqueName: \"kubernetes.io/projected/62cc6e29-7f2d-4c3d-b32c-4b906520eded-kube-api-access-bdh5w\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.007234 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.007298 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.007879 4803 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.013519 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.021200 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.027855 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.030878 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62cc6e29-7f2d-4c3d-b32c-4b906520eded-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.033179 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdh5w\" (UniqueName: \"kubernetes.io/projected/62cc6e29-7f2d-4c3d-b32c-4b906520eded-kube-api-access-bdh5w\") pod \"nova-cell1-novncproxy-0\" (UID: \"62cc6e29-7f2d-4c3d-b32c-4b906520eded\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.146872 
4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.640184 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 17:37:17 crc kubenswrapper[4803]: W0320 17:37:17.644965 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62cc6e29_7f2d_4c3d_b32c_4b906520eded.slice/crio-c43e1ee2bc11af892f2200414de96093810e2decef8da018a76916c8a98ef795 WatchSource:0}: Error finding container c43e1ee2bc11af892f2200414de96093810e2decef8da018a76916c8a98ef795: Status 404 returned error can't find the container with id c43e1ee2bc11af892f2200414de96093810e2decef8da018a76916c8a98ef795 Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.746183 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9dc8572-a2f9-4940-a959-9f90656358b2","Type":"ContainerStarted","Data":"e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059"} Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.746233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9dc8572-a2f9-4940-a959-9f90656358b2","Type":"ContainerStarted","Data":"2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9"} Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.746249 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9dc8572-a2f9-4940-a959-9f90656358b2","Type":"ContainerStarted","Data":"3727f1e6cead65ba3fb503760532a21c1e9d36f565712a32009b8c77ebdc0c56"} Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.754603 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerStarted","Data":"9f831ab7a00ac2d07c83a96e12745d2647d714b2976624b478cb611392731731"} Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.754710 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-central-agent" containerID="cri-o://afef0bc2eab04e4663cff0472df588bd41cf087eacf339adfc6a0aeaed353352" gracePeriod=30 Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.754733 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.754777 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="sg-core" containerID="cri-o://ef46562dba43743d6d3b5a87301072fb0c4a928647f7f71bc258530d740ea84d" gracePeriod=30 Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.754798 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-notification-agent" containerID="cri-o://29c149d33bdc0c586f07218cad9624861034639f49e9c544f0106e67d704c690" gracePeriod=30 Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.754830 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="proxy-httpd" containerID="cri-o://9f831ab7a00ac2d07c83a96e12745d2647d714b2976624b478cb611392731731" gracePeriod=30 Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.766711 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.7666948759999999 podStartE2EDuration="1.766694876s" podCreationTimestamp="2026-03-20 17:37:16 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:17.760196311 +0000 UTC m=+1247.671788391" watchObservedRunningTime="2026-03-20 17:37:17.766694876 +0000 UTC m=+1247.678286946" Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.771767 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"62cc6e29-7f2d-4c3d-b32c-4b906520eded","Type":"ContainerStarted","Data":"c43e1ee2bc11af892f2200414de96093810e2decef8da018a76916c8a98ef795"} Mar 20 17:37:17 crc kubenswrapper[4803]: I0320 17:37:17.789664 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.87960783 podStartE2EDuration="9.789648065s" podCreationTimestamp="2026-03-20 17:37:08 +0000 UTC" firstStartedPulling="2026-03-20 17:37:09.608124586 +0000 UTC m=+1239.519716666" lastFinishedPulling="2026-03-20 17:37:16.518164831 +0000 UTC m=+1246.429756901" observedRunningTime="2026-03-20 17:37:17.784866216 +0000 UTC m=+1247.696458286" watchObservedRunningTime="2026-03-20 17:37:17.789648065 +0000 UTC m=+1247.701240125" Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.791457 4803 generic.go:334] "Generic (PLEG): container finished" podID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerID="9f831ab7a00ac2d07c83a96e12745d2647d714b2976624b478cb611392731731" exitCode=0 Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.791752 4803 generic.go:334] "Generic (PLEG): container finished" podID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerID="ef46562dba43743d6d3b5a87301072fb0c4a928647f7f71bc258530d740ea84d" exitCode=2 Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.791760 4803 generic.go:334] "Generic (PLEG): container finished" podID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerID="29c149d33bdc0c586f07218cad9624861034639f49e9c544f0106e67d704c690" exitCode=0 Mar 20 17:37:18 crc 
kubenswrapper[4803]: I0320 17:37:18.791767 4803 generic.go:334] "Generic (PLEG): container finished" podID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerID="afef0bc2eab04e4663cff0472df588bd41cf087eacf339adfc6a0aeaed353352" exitCode=0 Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.791610 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerDied","Data":"9f831ab7a00ac2d07c83a96e12745d2647d714b2976624b478cb611392731731"} Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.791818 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerDied","Data":"ef46562dba43743d6d3b5a87301072fb0c4a928647f7f71bc258530d740ea84d"} Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.791829 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerDied","Data":"29c149d33bdc0c586f07218cad9624861034639f49e9c544f0106e67d704c690"} Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.791840 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerDied","Data":"afef0bc2eab04e4663cff0472df588bd41cf087eacf339adfc6a0aeaed353352"} Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.794583 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"62cc6e29-7f2d-4c3d-b32c-4b906520eded","Type":"ContainerStarted","Data":"68d6b627ec44835376740c02918adfc8998744a56c916427f49cb53060563761"} Mar 20 17:37:18 crc kubenswrapper[4803]: I0320 17:37:18.838413 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.838382927 podStartE2EDuration="2.838382927s" 
podCreationTimestamp="2026-03-20 17:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:18.819877488 +0000 UTC m=+1248.731469568" watchObservedRunningTime="2026-03-20 17:37:18.838382927 +0000 UTC m=+1248.749975037" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.096954 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160342 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-log-httpd\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160445 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-ceilometer-tls-certs\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160470 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-scripts\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160542 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-combined-ca-bundle\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160558 4803 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-config-data\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160582 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-run-httpd\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160608 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-sg-core-conf-yaml\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160660 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djggd\" (UniqueName: \"kubernetes.io/projected/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-kube-api-access-djggd\") pod \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\" (UID: \"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4\") " Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.160896 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.161014 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.161825 4803 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.161845 4803 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.166942 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-scripts" (OuterVolumeSpecName: "scripts") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.167870 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-kube-api-access-djggd" (OuterVolumeSpecName: "kube-api-access-djggd") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "kube-api-access-djggd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.222455 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.256130 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.257642 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.263338 4803 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.263365 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.263374 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.263383 4803 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.263392 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djggd\" (UniqueName: \"kubernetes.io/projected/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-kube-api-access-djggd\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.280720 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.303649 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-config-data" (OuterVolumeSpecName: "config-data") pod "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" (UID: "210fe4bd-2d7a-4980-9e65-5e7e2abab8b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.365206 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-combined-ca-bundle\") pod \"6145c682-fa69-4755-a845-883da359fffd\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") "
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.365364 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5bj6\" (UniqueName: \"kubernetes.io/projected/6145c682-fa69-4755-a845-883da359fffd-kube-api-access-p5bj6\") pod \"6145c682-fa69-4755-a845-883da359fffd\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") "
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.365565 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6145c682-fa69-4755-a845-883da359fffd-logs\") pod \"6145c682-fa69-4755-a845-883da359fffd\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") "
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.365734 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-config-data\") pod \"6145c682-fa69-4755-a845-883da359fffd\" (UID: \"6145c682-fa69-4755-a845-883da359fffd\") "
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.366854 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.367046 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6145c682-fa69-4755-a845-883da359fffd-logs" (OuterVolumeSpecName: "logs") pod "6145c682-fa69-4755-a845-883da359fffd" (UID: "6145c682-fa69-4755-a845-883da359fffd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.377269 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6145c682-fa69-4755-a845-883da359fffd-kube-api-access-p5bj6" (OuterVolumeSpecName: "kube-api-access-p5bj6") pod "6145c682-fa69-4755-a845-883da359fffd" (UID: "6145c682-fa69-4755-a845-883da359fffd"). InnerVolumeSpecName "kube-api-access-p5bj6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.396515 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6145c682-fa69-4755-a845-883da359fffd" (UID: "6145c682-fa69-4755-a845-883da359fffd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.402680 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-config-data" (OuterVolumeSpecName: "config-data") pod "6145c682-fa69-4755-a845-883da359fffd" (UID: "6145c682-fa69-4755-a845-883da359fffd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.468747 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6145c682-fa69-4755-a845-883da359fffd-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.468787 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.468800 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6145c682-fa69-4755-a845-883da359fffd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.468813 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5bj6\" (UniqueName: \"kubernetes.io/projected/6145c682-fa69-4755-a845-883da359fffd-kube-api-access-p5bj6\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.803675 4803 generic.go:334] "Generic (PLEG): container finished" podID="6145c682-fa69-4755-a845-883da359fffd" containerID="47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2" exitCode=0
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.803871 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6145c682-fa69-4755-a845-883da359fffd","Type":"ContainerDied","Data":"47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2"}
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.803938 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.804023 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6145c682-fa69-4755-a845-883da359fffd","Type":"ContainerDied","Data":"5b47ab503812b1aecc25785b49198d6b8fab1ceeec621d239cc25520b3ab4e34"}
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.804044 4803 scope.go:117] "RemoveContainer" containerID="47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.810035 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.813435 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"210fe4bd-2d7a-4980-9e65-5e7e2abab8b4","Type":"ContainerDied","Data":"f093275566d728b23d92bf0e241625c05c6d745ef127aa1b2c5acad5f2e9c59f"}
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.830932 4803 scope.go:117] "RemoveContainer" containerID="608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.865703 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.882510 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.904768 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.907873 4803 scope.go:117] "RemoveContainer" containerID="47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2"
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.914740 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2\": container with ID starting with 47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2 not found: ID does not exist" containerID="47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.914783 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2"} err="failed to get container status \"47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2\": rpc error: code = NotFound desc = could not find container \"47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2\": container with ID starting with 47911531d82bb8f258631306349d22c128e07d488c3ffc4bdd8c3671f6720fe2 not found: ID does not exist"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.914810 4803 scope.go:117] "RemoveContainer" containerID="608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b"
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.915786 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b\": container with ID starting with 608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b not found: ID does not exist" containerID="608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.915843 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b"} err="failed to get container status \"608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b\": rpc error: code = NotFound desc = could not find container \"608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b\": container with ID
starting with 608dbe8ef2e174220bd2e4544159ae00f980bd052e2a1fd2e37b5955c297ef8b not found: ID does not exist"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.915869 4803 scope.go:117] "RemoveContainer" containerID="9f831ab7a00ac2d07c83a96e12745d2647d714b2976624b478cb611392731731"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.916545 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.926657 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.927098 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="proxy-httpd"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927121 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="proxy-httpd"
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.927144 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-api"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927155 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-api"
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.927169 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-log"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927177 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-log"
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.927201 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="sg-core"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927210 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="sg-core"
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.927233 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-notification-agent"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927241 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-notification-agent"
Mar 20 17:37:19 crc kubenswrapper[4803]: E0320 17:37:19.927257 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-central-agent"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927264 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-central-agent"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927479 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-log"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927494 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="sg-core"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927510 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="proxy-httpd"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927549 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-notification-agent"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927577 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6145c682-fa69-4755-a845-883da359fffd" containerName="nova-api-api"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.927590 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" containerName="ceilometer-central-agent"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.928993 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.930879 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.934039 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.934963 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.937983 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.948465 4803 scope.go:117] "RemoveContainer" containerID="ef46562dba43743d6d3b5a87301072fb0c4a928647f7f71bc258530d740ea84d"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.950753 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.952867 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.955090 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.955167 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.955246 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.959095 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:19 crc kubenswrapper[4803]: I0320 17:37:19.974487 4803 scope.go:117] "RemoveContainer" containerID="29c149d33bdc0c586f07218cad9624861034639f49e9c544f0106e67d704c690"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.004578 4803 scope.go:117] "RemoveContainer" containerID="afef0bc2eab04e4663cff0472df588bd41cf087eacf339adfc6a0aeaed353352"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.078840 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.078886 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-scripts\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.078940 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-config-data\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079032 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079068 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079130 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-logs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079167 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079190 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a7417bc-8901-4f01-ae88-5b304c7371a9-run-httpd\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") "
pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079386 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktpzg\" (UniqueName: \"kubernetes.io/projected/0a7417bc-8901-4f01-ae88-5b304c7371a9-kube-api-access-ktpzg\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079458 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9497l\" (UniqueName: \"kubernetes.io/projected/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-kube-api-access-9497l\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079491 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079567 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079591 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a7417bc-8901-4f01-ae88-5b304c7371a9-log-httpd\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.079622 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-config-data\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.180971 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-logs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181312 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181335 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a7417bc-8901-4f01-ae88-5b304c7371a9-run-httpd\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181374 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktpzg\" (UniqueName: \"kubernetes.io/projected/0a7417bc-8901-4f01-ae88-5b304c7371a9-kube-api-access-ktpzg\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181402 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9497l\" (UniqueName: \"kubernetes.io/projected/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-kube-api-access-9497l\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181440 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181468 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181481 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a7417bc-8901-4f01-ae88-5b304c7371a9-log-httpd\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181496 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-config-data\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181519 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181553 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-logs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181576 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-scripts\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181785 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-config-data\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181838 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.181855 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.182099 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a7417bc-8901-4f01-ae88-5b304c7371a9-run-httpd\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.186189 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0a7417bc-8901-4f01-ae88-5b304c7371a9-log-httpd\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.193147 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-public-tls-certs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.193311 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.193555 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.193592 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-scripts\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.193659 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") "
pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.193725 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.197241 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-config-data\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.197347 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9497l\" (UniqueName: \"kubernetes.io/projected/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-kube-api-access-9497l\") pod \"nova-api-0\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") " pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.197597 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-config-data\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.197745 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0a7417bc-8901-4f01-ae88-5b304c7371a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.223770 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktpzg\" (UniqueName: \"kubernetes.io/projected/0a7417bc-8901-4f01-ae88-5b304c7371a9-kube-api-access-ktpzg\") pod \"ceilometer-0\" (UID: \"0a7417bc-8901-4f01-ae88-5b304c7371a9\") " pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.252177 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.276862 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.744943 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.798574 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 17:37:20 crc kubenswrapper[4803]: W0320 17:37:20.802445 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a7417bc_8901_4f01_ae88_5b304c7371a9.slice/crio-4aaddced23e987508691dc501ab9ad671e5349becb2527ac462f315a2ca27be7 WatchSource:0}: Error finding container 4aaddced23e987508691dc501ab9ad671e5349becb2527ac462f315a2ca27be7: Status 404 returned error can't find the container with id 4aaddced23e987508691dc501ab9ad671e5349becb2527ac462f315a2ca27be7
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.825677 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee","Type":"ContainerStarted","Data":"bdbb567d3094a30e3912aa2d9472e9f36d96bd2e8e9eda26de029f16aece44d1"}
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.827515 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a7417bc-8901-4f01-ae88-5b304c7371a9","Type":"ContainerStarted","Data":"4aaddced23e987508691dc501ab9ad671e5349becb2527ac462f315a2ca27be7"}
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.859891 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210fe4bd-2d7a-4980-9e65-5e7e2abab8b4" path="/var/lib/kubelet/pods/210fe4bd-2d7a-4980-9e65-5e7e2abab8b4/volumes"
Mar 20 17:37:20 crc kubenswrapper[4803]: I0320 17:37:20.861416 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6145c682-fa69-4755-a845-883da359fffd" path="/var/lib/kubelet/pods/6145c682-fa69-4755-a845-883da359fffd/volumes"
Mar 20 17:37:21 crc kubenswrapper[4803]: I0320 17:37:21.837467 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a7417bc-8901-4f01-ae88-5b304c7371a9","Type":"ContainerStarted","Data":"57e7beba103ba487876639773c2be10695966b50debbb88bcd09040bf8cdc4fa"}
Mar 20 17:37:21 crc kubenswrapper[4803]: I0320 17:37:21.839468 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee","Type":"ContainerStarted","Data":"668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7"}
Mar 20 17:37:21 crc kubenswrapper[4803]: I0320 17:37:21.839509 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee","Type":"ContainerStarted","Data":"ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795"}
Mar 20 17:37:21 crc kubenswrapper[4803]: I0320 17:37:21.867776 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.867758288 podStartE2EDuration="2.867758288s" podCreationTimestamp="2026-03-20 17:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:21.857024149 +0000 UTC m=+1251.768616229" watchObservedRunningTime="2026-03-20 17:37:21.867758288 +0000 UTC m=+1251.779350368"
Mar 20 17:37:22 crc kubenswrapper[4803]: I0320 17:37:22.146992 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 17:37:22 crc kubenswrapper[4803]: I0320 17:37:22.864206 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a7417bc-8901-4f01-ae88-5b304c7371a9","Type":"ContainerStarted","Data":"4b7335e65e23f6904cf969c0b4811e7506d072bd6599d0129f42610eaedec079"}
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.269439 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.362133 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-bcc55"]
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.362392 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" podUID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerName="dnsmasq-dns" containerID="cri-o://81a6ad1f271b17b2925da46daa67eb84639642beee0928c9d4179daf0501e2ed" gracePeriod=10
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.863279 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a7417bc-8901-4f01-ae88-5b304c7371a9","Type":"ContainerStarted","Data":"2cdd9e8687c6c7ea285a8141a0ffc2aa05df0e9899baa37554f7535a23ff93a1"}
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.864556 4803 generic.go:334] "Generic (PLEG): container finished" podID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerID="81a6ad1f271b17b2925da46daa67eb84639642beee0928c9d4179daf0501e2ed" exitCode=0
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.864585 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" event={"ID":"b6477cd4-502c-467c-8cc1-b706f62f249d","Type":"ContainerDied","Data":"81a6ad1f271b17b2925da46daa67eb84639642beee0928c9d4179daf0501e2ed"}
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.864599 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" event={"ID":"b6477cd4-502c-467c-8cc1-b706f62f249d","Type":"ContainerDied","Data":"6e1eef29f8347a3101854feb44063b61577bb4e6b4eab4090bb0ba97f446d385"}
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.864609 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e1eef29f8347a3101854feb44063b61577bb4e6b4eab4090bb0ba97f446d385"
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.909019 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-bcc55"
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.969639 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsfdx\" (UniqueName: \"kubernetes.io/projected/b6477cd4-502c-467c-8cc1-b706f62f249d-kube-api-access-bsfdx\") pod \"b6477cd4-502c-467c-8cc1-b706f62f249d\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") "
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.969719 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-swift-storage-0\") pod \"b6477cd4-502c-467c-8cc1-b706f62f249d\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") "
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.969857 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-nb\") pod \"b6477cd4-502c-467c-8cc1-b706f62f249d\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") "
Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.969886 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-config\")
pod \"b6477cd4-502c-467c-8cc1-b706f62f249d\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.969911 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-sb\") pod \"b6477cd4-502c-467c-8cc1-b706f62f249d\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.969967 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-svc\") pod \"b6477cd4-502c-467c-8cc1-b706f62f249d\" (UID: \"b6477cd4-502c-467c-8cc1-b706f62f249d\") " Mar 20 17:37:23 crc kubenswrapper[4803]: I0320 17:37:23.984408 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6477cd4-502c-467c-8cc1-b706f62f249d-kube-api-access-bsfdx" (OuterVolumeSpecName: "kube-api-access-bsfdx") pod "b6477cd4-502c-467c-8cc1-b706f62f249d" (UID: "b6477cd4-502c-467c-8cc1-b706f62f249d"). InnerVolumeSpecName "kube-api-access-bsfdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.013873 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b6477cd4-502c-467c-8cc1-b706f62f249d" (UID: "b6477cd4-502c-467c-8cc1-b706f62f249d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.014465 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b6477cd4-502c-467c-8cc1-b706f62f249d" (UID: "b6477cd4-502c-467c-8cc1-b706f62f249d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.016462 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-config" (OuterVolumeSpecName: "config") pod "b6477cd4-502c-467c-8cc1-b706f62f249d" (UID: "b6477cd4-502c-467c-8cc1-b706f62f249d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.018618 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b6477cd4-502c-467c-8cc1-b706f62f249d" (UID: "b6477cd4-502c-467c-8cc1-b706f62f249d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.031913 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b6477cd4-502c-467c-8cc1-b706f62f249d" (UID: "b6477cd4-502c-467c-8cc1-b706f62f249d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.072177 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.072214 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.072227 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.072241 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.072255 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsfdx\" (UniqueName: \"kubernetes.io/projected/b6477cd4-502c-467c-8cc1-b706f62f249d-kube-api-access-bsfdx\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.072268 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b6477cd4-502c-467c-8cc1-b706f62f249d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.877550 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0a7417bc-8901-4f01-ae88-5b304c7371a9","Type":"ContainerStarted","Data":"fe46f6e849d9b3bcc0afd0ca4a90de8c3da9e4098d328eb0d5f901bdfdb62ec8"} Mar 20 17:37:24 crc kubenswrapper[4803]: 
I0320 17:37:24.877861 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.877750 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-bcc55" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.914149 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.168953242 podStartE2EDuration="5.914127166s" podCreationTimestamp="2026-03-20 17:37:19 +0000 UTC" firstStartedPulling="2026-03-20 17:37:20.81660725 +0000 UTC m=+1250.728199320" lastFinishedPulling="2026-03-20 17:37:24.561781164 +0000 UTC m=+1254.473373244" observedRunningTime="2026-03-20 17:37:24.902302068 +0000 UTC m=+1254.813894158" watchObservedRunningTime="2026-03-20 17:37:24.914127166 +0000 UTC m=+1254.825719246" Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.927754 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-bcc55"] Mar 20 17:37:24 crc kubenswrapper[4803]: I0320 17:37:24.944925 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-bcc55"] Mar 20 17:37:26 crc kubenswrapper[4803]: I0320 17:37:26.445624 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:37:26 crc kubenswrapper[4803]: I0320 17:37:26.447438 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:37:26 crc kubenswrapper[4803]: I0320 17:37:26.862320 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6477cd4-502c-467c-8cc1-b706f62f249d" path="/var/lib/kubelet/pods/b6477cd4-502c-467c-8cc1-b706f62f249d/volumes" Mar 20 17:37:27 crc kubenswrapper[4803]: I0320 17:37:27.147794 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:27 crc kubenswrapper[4803]: I0320 17:37:27.181762 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:27 crc kubenswrapper[4803]: I0320 17:37:27.464879 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:27 crc kubenswrapper[4803]: I0320 17:37:27.464887 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:27 crc kubenswrapper[4803]: I0320 17:37:27.940385 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.137211 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-v5q8f"] Mar 20 17:37:28 crc kubenswrapper[4803]: E0320 17:37:28.137698 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerName="init" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.137719 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerName="init" Mar 20 17:37:28 crc kubenswrapper[4803]: E0320 17:37:28.137739 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerName="dnsmasq-dns" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.137747 4803 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerName="dnsmasq-dns" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.137987 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6477cd4-502c-467c-8cc1-b706f62f249d" containerName="dnsmasq-dns" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.138723 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.141388 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.141905 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.163585 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5q8f"] Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.265335 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-scripts\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.265463 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.265516 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn55m\" (UniqueName: 
\"kubernetes.io/projected/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-kube-api-access-xn55m\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.265628 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-config-data\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.366945 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.367838 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn55m\" (UniqueName: \"kubernetes.io/projected/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-kube-api-access-xn55m\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.368185 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-config-data\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.368509 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-scripts\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.375125 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-scripts\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.377324 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.384392 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-config-data\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.389165 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn55m\" (UniqueName: \"kubernetes.io/projected/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-kube-api-access-xn55m\") pod \"nova-cell1-cell-mapping-v5q8f\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.458141 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.898651 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5q8f"] Mar 20 17:37:28 crc kubenswrapper[4803]: W0320 17:37:28.906640 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12ac9cf1_3fd1_49db_a10d_eedb31f9c82d.slice/crio-f98417c8dacb38d767299e834df6e548b90c3aee755df9347f8b5b0997b262e3 WatchSource:0}: Error finding container f98417c8dacb38d767299e834df6e548b90c3aee755df9347f8b5b0997b262e3: Status 404 returned error can't find the container with id f98417c8dacb38d767299e834df6e548b90c3aee755df9347f8b5b0997b262e3 Mar 20 17:37:28 crc kubenswrapper[4803]: I0320 17:37:28.934648 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5q8f" event={"ID":"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d","Type":"ContainerStarted","Data":"f98417c8dacb38d767299e834df6e548b90c3aee755df9347f8b5b0997b262e3"} Mar 20 17:37:29 crc kubenswrapper[4803]: I0320 17:37:29.943933 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5q8f" event={"ID":"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d","Type":"ContainerStarted","Data":"ccc886527813a854c8522a756637de08cff00151ac71b7666ffe61f174094326"} Mar 20 17:37:29 crc kubenswrapper[4803]: I0320 17:37:29.965478 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-v5q8f" podStartSLOduration=1.965463117 podStartE2EDuration="1.965463117s" podCreationTimestamp="2026-03-20 17:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:29.957429941 +0000 UTC m=+1259.869022011" watchObservedRunningTime="2026-03-20 17:37:29.965463117 +0000 UTC m=+1259.877055187" Mar 20 17:37:30 crc 
kubenswrapper[4803]: I0320 17:37:30.253331 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:37:30 crc kubenswrapper[4803]: I0320 17:37:30.253398 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:37:31 crc kubenswrapper[4803]: I0320 17:37:31.271667 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:31 crc kubenswrapper[4803]: I0320 17:37:31.272144 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:33 crc kubenswrapper[4803]: I0320 17:37:33.985136 4803 generic.go:334] "Generic (PLEG): container finished" podID="12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" containerID="ccc886527813a854c8522a756637de08cff00151ac71b7666ffe61f174094326" exitCode=0 Mar 20 17:37:33 crc kubenswrapper[4803]: I0320 17:37:33.985226 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5q8f" event={"ID":"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d","Type":"ContainerDied","Data":"ccc886527813a854c8522a756637de08cff00151ac71b7666ffe61f174094326"} Mar 20 17:37:34 crc kubenswrapper[4803]: I0320 17:37:34.445242 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:37:34 crc kubenswrapper[4803]: I0320 17:37:34.445304 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.418670 
4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.514825 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-combined-ca-bundle\") pod \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.515020 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-scripts\") pod \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.515197 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-config-data\") pod \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.515248 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn55m\" (UniqueName: \"kubernetes.io/projected/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-kube-api-access-xn55m\") pod \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\" (UID: \"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d\") " Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.542712 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-scripts" (OuterVolumeSpecName: "scripts") pod "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" (UID: "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.543011 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-kube-api-access-xn55m" (OuterVolumeSpecName: "kube-api-access-xn55m") pod "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" (UID: "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d"). InnerVolumeSpecName "kube-api-access-xn55m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.565660 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" (UID: "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.607681 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-config-data" (OuterVolumeSpecName: "config-data") pod "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" (UID: "12ac9cf1-3fd1-49db-a10d-eedb31f9c82d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.624629 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.624673 4803 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.624682 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:35 crc kubenswrapper[4803]: I0320 17:37:35.624691 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn55m\" (UniqueName: \"kubernetes.io/projected/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d-kube-api-access-xn55m\") on node \"crc\" DevicePath \"\"" Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.010483 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v5q8f" Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.010382 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v5q8f" event={"ID":"12ac9cf1-3fd1-49db-a10d-eedb31f9c82d","Type":"ContainerDied","Data":"f98417c8dacb38d767299e834df6e548b90c3aee755df9347f8b5b0997b262e3"} Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.014675 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f98417c8dacb38d767299e834df6e548b90c3aee755df9347f8b5b0997b262e3" Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.241312 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.241682 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-log" containerID="cri-o://ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795" gracePeriod=30 Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.242030 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-api" containerID="cri-o://668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7" gracePeriod=30 Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.257284 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.257546 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="12e443ee-a472-451b-8243-38a405aacc73" containerName="nova-scheduler-scheduler" containerID="cri-o://5cb960c5d486708e2feea8bbaca388c49dfbc83ed666e179c283fcd3badbf2ae" gracePeriod=30 Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 
17:37:36.270403 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.270669 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-log" containerID="cri-o://2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9" gracePeriod=30
Mar 20 17:37:36 crc kubenswrapper[4803]: I0320 17:37:36.271428 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-metadata" containerID="cri-o://e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059" gracePeriod=30
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.027746 4803 generic.go:334] "Generic (PLEG): container finished" podID="12e443ee-a472-451b-8243-38a405aacc73" containerID="5cb960c5d486708e2feea8bbaca388c49dfbc83ed666e179c283fcd3badbf2ae" exitCode=0
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.027839 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12e443ee-a472-451b-8243-38a405aacc73","Type":"ContainerDied","Data":"5cb960c5d486708e2feea8bbaca388c49dfbc83ed666e179c283fcd3badbf2ae"}
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.030273 4803 generic.go:334] "Generic (PLEG): container finished" podID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerID="ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795" exitCode=143
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.030332 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee","Type":"ContainerDied","Data":"ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795"}
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.032712 4803 generic.go:334] "Generic (PLEG): container finished" podID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerID="2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9" exitCode=143
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.032748 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9dc8572-a2f9-4940-a959-9f90656358b2","Type":"ContainerDied","Data":"2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9"}
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.190492 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.264178 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-config-data\") pod \"12e443ee-a472-451b-8243-38a405aacc73\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") "
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.264327 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-combined-ca-bundle\") pod \"12e443ee-a472-451b-8243-38a405aacc73\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") "
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.264401 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv27g\" (UniqueName: \"kubernetes.io/projected/12e443ee-a472-451b-8243-38a405aacc73-kube-api-access-rv27g\") pod \"12e443ee-a472-451b-8243-38a405aacc73\" (UID: \"12e443ee-a472-451b-8243-38a405aacc73\") "
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.271638 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e443ee-a472-451b-8243-38a405aacc73-kube-api-access-rv27g" (OuterVolumeSpecName: "kube-api-access-rv27g") pod "12e443ee-a472-451b-8243-38a405aacc73" (UID: "12e443ee-a472-451b-8243-38a405aacc73"). InnerVolumeSpecName "kube-api-access-rv27g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.299669 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-config-data" (OuterVolumeSpecName: "config-data") pod "12e443ee-a472-451b-8243-38a405aacc73" (UID: "12e443ee-a472-451b-8243-38a405aacc73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.299689 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12e443ee-a472-451b-8243-38a405aacc73" (UID: "12e443ee-a472-451b-8243-38a405aacc73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.366552 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv27g\" (UniqueName: \"kubernetes.io/projected/12e443ee-a472-451b-8243-38a405aacc73-kube-api-access-rv27g\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.366587 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:37 crc kubenswrapper[4803]: I0320 17:37:37.366599 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12e443ee-a472-451b-8243-38a405aacc73-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.045812 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"12e443ee-a472-451b-8243-38a405aacc73","Type":"ContainerDied","Data":"81316e82965aacf419aa44b13a00ab66e688350e2ee72ecc513f5b7f88e31d52"}
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.045882 4803 scope.go:117] "RemoveContainer" containerID="5cb960c5d486708e2feea8bbaca388c49dfbc83ed666e179c283fcd3badbf2ae"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.045910 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.110723 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.125180 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.135631 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:37:38 crc kubenswrapper[4803]: E0320 17:37:38.136035 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" containerName="nova-manage"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.136051 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" containerName="nova-manage"
Mar 20 17:37:38 crc kubenswrapper[4803]: E0320 17:37:38.136065 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e443ee-a472-451b-8243-38a405aacc73" containerName="nova-scheduler-scheduler"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.136072 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e443ee-a472-451b-8243-38a405aacc73" containerName="nova-scheduler-scheduler"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.136235 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" containerName="nova-manage"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.136249 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e443ee-a472-451b-8243-38a405aacc73" containerName="nova-scheduler-scheduler"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.136864 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.139207 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.147775 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.252960 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.253018 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.290603 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51598f47-53eb-4d5d-917d-3655d7e200e8-config-data\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.290642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51598f47-53eb-4d5d-917d-3655d7e200e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.290952 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpl7d\" (UniqueName: \"kubernetes.io/projected/51598f47-53eb-4d5d-917d-3655d7e200e8-kube-api-access-mpl7d\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.393293 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51598f47-53eb-4d5d-917d-3655d7e200e8-config-data\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.393345 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51598f47-53eb-4d5d-917d-3655d7e200e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.393436 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpl7d\" (UniqueName: \"kubernetes.io/projected/51598f47-53eb-4d5d-917d-3655d7e200e8-kube-api-access-mpl7d\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.399451 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51598f47-53eb-4d5d-917d-3655d7e200e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.411982 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51598f47-53eb-4d5d-917d-3655d7e200e8-config-data\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.418964 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpl7d\" (UniqueName: \"kubernetes.io/projected/51598f47-53eb-4d5d-917d-3655d7e200e8-kube-api-access-mpl7d\") pod \"nova-scheduler-0\" (UID: \"51598f47-53eb-4d5d-917d-3655d7e200e8\") " pod="openstack/nova-scheduler-0"
Mar 20 17:37:38 crc kubenswrapper[4803]: I0320 17:37:38.456116 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 17:37:39 crc kubenswrapper[4803]: I0320 17:37:38.877898 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e443ee-a472-451b-8243-38a405aacc73" path="/var/lib/kubelet/pods/12e443ee-a472-451b-8243-38a405aacc73/volumes"
Mar 20 17:37:39 crc kubenswrapper[4803]: I0320 17:37:39.649729 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 17:37:39 crc kubenswrapper[4803]: I0320 17:37:39.868112 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 17:37:39 crc kubenswrapper[4803]: I0320 17:37:39.956079 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.037311 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-public-tls-certs\") pod \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.037767 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-internal-tls-certs\") pod \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.037824 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-combined-ca-bundle\") pod \"b9dc8572-a2f9-4940-a959-9f90656358b2\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.037845 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-config-data\") pod \"b9dc8572-a2f9-4940-a959-9f90656358b2\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.037865 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9497l\" (UniqueName: \"kubernetes.io/projected/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-kube-api-access-9497l\") pod \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.037932 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-logs\") pod \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.037957 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2qp\" (UniqueName: \"kubernetes.io/projected/b9dc8572-a2f9-4940-a959-9f90656358b2-kube-api-access-mq2qp\") pod \"b9dc8572-a2f9-4940-a959-9f90656358b2\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.038007 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-config-data\") pod \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.038028 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8572-a2f9-4940-a959-9f90656358b2-logs\") pod \"b9dc8572-a2f9-4940-a959-9f90656358b2\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.038051 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-combined-ca-bundle\") pod \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\" (UID: \"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.038085 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-nova-metadata-tls-certs\") pod \"b9dc8572-a2f9-4940-a959-9f90656358b2\" (UID: \"b9dc8572-a2f9-4940-a959-9f90656358b2\") "
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.038502 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-logs" (OuterVolumeSpecName: "logs") pod "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" (UID: "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.038854 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9dc8572-a2f9-4940-a959-9f90656358b2-logs" (OuterVolumeSpecName: "logs") pod "b9dc8572-a2f9-4940-a959-9f90656358b2" (UID: "b9dc8572-a2f9-4940-a959-9f90656358b2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.041036 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9dc8572-a2f9-4940-a959-9f90656358b2-kube-api-access-mq2qp" (OuterVolumeSpecName: "kube-api-access-mq2qp") pod "b9dc8572-a2f9-4940-a959-9f90656358b2" (UID: "b9dc8572-a2f9-4940-a959-9f90656358b2"). InnerVolumeSpecName "kube-api-access-mq2qp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.041129 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-kube-api-access-9497l" (OuterVolumeSpecName: "kube-api-access-9497l") pod "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" (UID: "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee"). InnerVolumeSpecName "kube-api-access-9497l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.060323 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-config-data" (OuterVolumeSpecName: "config-data") pod "b9dc8572-a2f9-4940-a959-9f90656358b2" (UID: "b9dc8572-a2f9-4940-a959-9f90656358b2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.062134 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" (UID: "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.062906 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-config-data" (OuterVolumeSpecName: "config-data") pod "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" (UID: "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.072341 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9dc8572-a2f9-4940-a959-9f90656358b2" (UID: "b9dc8572-a2f9-4940-a959-9f90656358b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.072700 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"51598f47-53eb-4d5d-917d-3655d7e200e8","Type":"ContainerStarted","Data":"fa687b52a6501876b089017c73ccf7763974859c8fdec21edbc6a22610187783"}
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.072749 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"51598f47-53eb-4d5d-917d-3655d7e200e8","Type":"ContainerStarted","Data":"bca14508e6ac270b0566062db8fa408d87b5c88dd0ef8a64915752582f1daca3"}
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.074292 4803 generic.go:334] "Generic (PLEG): container finished" podID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerID="e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059" exitCode=0
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.074336 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.074512 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9dc8572-a2f9-4940-a959-9f90656358b2","Type":"ContainerDied","Data":"e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059"}
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.074782 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9dc8572-a2f9-4940-a959-9f90656358b2","Type":"ContainerDied","Data":"3727f1e6cead65ba3fb503760532a21c1e9d36f565712a32009b8c77ebdc0c56"}
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.074826 4803 scope.go:117] "RemoveContainer" containerID="e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.076506 4803 generic.go:334] "Generic (PLEG): container finished" podID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerID="668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7" exitCode=0
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.076563 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee","Type":"ContainerDied","Data":"668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7"}
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.076585 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee","Type":"ContainerDied","Data":"bdbb567d3094a30e3912aa2d9472e9f36d96bd2e8e9eda26de029f16aece44d1"}
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.076621 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.086826 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" (UID: "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.090942 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" (UID: "0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.098361 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.098340245 podStartE2EDuration="2.098340245s" podCreationTimestamp="2026-03-20 17:37:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:40.088826709 +0000 UTC m=+1270.000418779" watchObservedRunningTime="2026-03-20 17:37:40.098340245 +0000 UTC m=+1270.009932315"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.101372 4803 scope.go:117] "RemoveContainer" containerID="2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.106242 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b9dc8572-a2f9-4940-a959-9f90656358b2" (UID: "b9dc8572-a2f9-4940-a959-9f90656358b2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.120006 4803 scope.go:117] "RemoveContainer" containerID="e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059"
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.120494 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059\": container with ID starting with e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059 not found: ID does not exist" containerID="e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.120547 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059"} err="failed to get container status \"e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059\": rpc error: code = NotFound desc = could not find container \"e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059\": container with ID starting with e148ee395383335a7b9f94c5f1fa20b7ed24a7faa7a945a247e9c6c341119059 not found: ID does not exist"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.120577 4803 scope.go:117] "RemoveContainer" containerID="2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9"
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.120924 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9\": container with ID starting with 2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9 not found: ID does not exist" containerID="2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.120945 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9"} err="failed to get container status \"2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9\": rpc error: code = NotFound desc = could not find container \"2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9\": container with ID starting with 2dbc640e4d01b0e7e3e237292436f1394ca99f72affd5a4b22675e46521aaeb9 not found: ID does not exist"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.120958 4803 scope.go:117] "RemoveContainer" containerID="668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139874 4803 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139913 4803 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139924 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139934 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139945 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9497l\" (UniqueName: \"kubernetes.io/projected/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-kube-api-access-9497l\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139954 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139964 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2qp\" (UniqueName: \"kubernetes.io/projected/b9dc8572-a2f9-4940-a959-9f90656358b2-kube-api-access-mq2qp\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139972 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139981 4803 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9dc8572-a2f9-4940-a959-9f90656358b2-logs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139990 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.139999 4803 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9dc8572-a2f9-4940-a959-9f90656358b2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.142701 4803 scope.go:117] "RemoveContainer" containerID="ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.159820 4803 scope.go:117] "RemoveContainer" containerID="668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7"
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.160259 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7\": container with ID starting with 668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7 not found: ID does not exist" containerID="668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.160317 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7"} err="failed to get container status \"668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7\": rpc error: code = NotFound desc = could not find container \"668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7\": container with ID starting with 668c38e9d770b8e62a877eef9de05e34488d6a565bb9a7fcb74c4775e7e7a5c7 not found: ID does not exist"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.160357 4803 scope.go:117] "RemoveContainer" containerID="ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795"
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.160702 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795\": container with ID starting with ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795 not found: ID does not exist" containerID="ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.160739 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795"} err="failed to get container status \"ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795\": rpc error: code = NotFound desc = could not find container \"ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795\": container with ID starting with ef1b016649ace54f4a0d050e39edfe88eb4cfc05fb34e42a51763209480f7795 not found: ID does not exist"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.435359 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.466675 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.476366 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.487789 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.495727 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.496229 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-metadata"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496248 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-metadata"
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.496275 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-log"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496284 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-log"
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.496301 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-api"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496310 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-api"
Mar 20 17:37:40 crc kubenswrapper[4803]: E0320 17:37:40.496348 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-log"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496357 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-log"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496615 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-metadata"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496634 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-api"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496660 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" containerName="nova-api-log"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.496690 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" containerName="nova-metadata-log"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.497929 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.502345 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.502505 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.502597 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.505600 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.507227 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.508847 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.509012 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.510095 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.517090 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.756709 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56a359b-06fc-40e2-b128-c2427461160a-logs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.756790 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.756827 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.756895 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp5xm\" (UniqueName: \"kubernetes.io/projected/b56a359b-06fc-40e2-b128-c2427461160a-kube-api-access-dp5xm\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.756981 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.757002 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9461d82d-3e47-4dea-889c-a82c7f4a97b4-logs\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0"
Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.757035 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.757059 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-config-data\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.757096 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-public-tls-certs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.757125 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gl8l\" (UniqueName: \"kubernetes.io/projected/9461d82d-3e47-4dea-889c-a82c7f4a97b4-kube-api-access-8gl8l\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.757155 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-config-data\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858226 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56a359b-06fc-40e2-b128-c2427461160a-logs\") pod \"nova-api-0\" (UID: 
\"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858272 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858298 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858324 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee" path="/var/lib/kubelet/pods/0252ff4d-1b4f-4d2d-ab7f-fd765d89b9ee/volumes" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858332 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp5xm\" (UniqueName: \"kubernetes.io/projected/b56a359b-06fc-40e2-b128-c2427461160a-kube-api-access-dp5xm\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858558 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858586 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/9461d82d-3e47-4dea-889c-a82c7f4a97b4-logs\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858627 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858653 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-config-data\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858690 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-public-tls-certs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858703 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b56a359b-06fc-40e2-b128-c2427461160a-logs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858722 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gl8l\" (UniqueName: \"kubernetes.io/projected/9461d82d-3e47-4dea-889c-a82c7f4a97b4-kube-api-access-8gl8l\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.858751 
4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-config-data\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.859014 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9461d82d-3e47-4dea-889c-a82c7f4a97b4-logs\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.859295 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9dc8572-a2f9-4940-a959-9f90656358b2" path="/var/lib/kubelet/pods/b9dc8572-a2f9-4940-a959-9f90656358b2/volumes" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.863925 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.864389 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-config-data\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.865248 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.865608 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.865724 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9461d82d-3e47-4dea-889c-a82c7f4a97b4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.865831 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-public-tls-certs\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.865958 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b56a359b-06fc-40e2-b128-c2427461160a-config-data\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.876545 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp5xm\" (UniqueName: \"kubernetes.io/projected/b56a359b-06fc-40e2-b128-c2427461160a-kube-api-access-dp5xm\") pod \"nova-api-0\" (UID: \"b56a359b-06fc-40e2-b128-c2427461160a\") " pod="openstack/nova-api-0" Mar 20 17:37:40 crc kubenswrapper[4803]: I0320 17:37:40.877916 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gl8l\" (UniqueName: \"kubernetes.io/projected/9461d82d-3e47-4dea-889c-a82c7f4a97b4-kube-api-access-8gl8l\") pod \"nova-metadata-0\" (UID: 
\"9461d82d-3e47-4dea-889c-a82c7f4a97b4\") " pod="openstack/nova-metadata-0" Mar 20 17:37:41 crc kubenswrapper[4803]: I0320 17:37:41.141380 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 17:37:41 crc kubenswrapper[4803]: I0320 17:37:41.148573 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 17:37:41 crc kubenswrapper[4803]: I0320 17:37:41.676611 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 17:37:41 crc kubenswrapper[4803]: I0320 17:37:41.761355 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 17:37:41 crc kubenswrapper[4803]: W0320 17:37:41.774770 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb56a359b_06fc_40e2_b128_c2427461160a.slice/crio-56a5673b706a6b3c6bb20d2565c6862c00fd9c05112d0423b88ff71e5e046cf4 WatchSource:0}: Error finding container 56a5673b706a6b3c6bb20d2565c6862c00fd9c05112d0423b88ff71e5e046cf4: Status 404 returned error can't find the container with id 56a5673b706a6b3c6bb20d2565c6862c00fd9c05112d0423b88ff71e5e046cf4 Mar 20 17:37:42 crc kubenswrapper[4803]: I0320 17:37:42.121149 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9461d82d-3e47-4dea-889c-a82c7f4a97b4","Type":"ContainerStarted","Data":"9996574f658f1f37345cab50c0c2ddc49e700577413c70a2ad7987d92ab8ffdb"} Mar 20 17:37:42 crc kubenswrapper[4803]: I0320 17:37:42.121427 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9461d82d-3e47-4dea-889c-a82c7f4a97b4","Type":"ContainerStarted","Data":"6fb4c303f4a2eef0806c3d1cb45bd2e3a053e1862b23654f13417362694263b1"} Mar 20 17:37:42 crc kubenswrapper[4803]: I0320 17:37:42.123956 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b56a359b-06fc-40e2-b128-c2427461160a","Type":"ContainerStarted","Data":"c628a95bf8d4bc35ddfa632f2fa650a088536ad69dda05420bd28c74d35af9d5"} Mar 20 17:37:42 crc kubenswrapper[4803]: I0320 17:37:42.123984 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b56a359b-06fc-40e2-b128-c2427461160a","Type":"ContainerStarted","Data":"56a5673b706a6b3c6bb20d2565c6862c00fd9c05112d0423b88ff71e5e046cf4"} Mar 20 17:37:43 crc kubenswrapper[4803]: I0320 17:37:43.134554 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9461d82d-3e47-4dea-889c-a82c7f4a97b4","Type":"ContainerStarted","Data":"f15cf2cdf1a47213b8db430f17b036eba110a0bb67198e7fedb1472a033387c4"} Mar 20 17:37:43 crc kubenswrapper[4803]: I0320 17:37:43.138022 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b56a359b-06fc-40e2-b128-c2427461160a","Type":"ContainerStarted","Data":"09cce353ddd07a2e94ae877957cbd1ef296323d84eac2df740525377fd3310c9"} Mar 20 17:37:43 crc kubenswrapper[4803]: I0320 17:37:43.169833 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.169790719 podStartE2EDuration="3.169790719s" podCreationTimestamp="2026-03-20 17:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:37:43.158032412 +0000 UTC m=+1273.069624492" watchObservedRunningTime="2026-03-20 17:37:43.169790719 +0000 UTC m=+1273.081382799" Mar 20 17:37:43 crc kubenswrapper[4803]: I0320 17:37:43.180468 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.180269751 podStartE2EDuration="3.180269751s" podCreationTimestamp="2026-03-20 17:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 17:37:43.174829835 +0000 UTC m=+1273.086421915" watchObservedRunningTime="2026-03-20 17:37:43.180269751 +0000 UTC m=+1273.091861821" Mar 20 17:37:43 crc kubenswrapper[4803]: I0320 17:37:43.457656 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 17:37:48 crc kubenswrapper[4803]: I0320 17:37:48.457896 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 17:37:48 crc kubenswrapper[4803]: I0320 17:37:48.490210 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 17:37:49 crc kubenswrapper[4803]: I0320 17:37:49.248585 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 17:37:50 crc kubenswrapper[4803]: I0320 17:37:50.288545 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 17:37:51 crc kubenswrapper[4803]: I0320 17:37:51.142855 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:37:51 crc kubenswrapper[4803]: I0320 17:37:51.142916 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 17:37:51 crc kubenswrapper[4803]: I0320 17:37:51.149776 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:37:51 crc kubenswrapper[4803]: I0320 17:37:51.149836 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 17:37:52 crc kubenswrapper[4803]: I0320 17:37:52.153720 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b56a359b-06fc-40e2-b128-c2427461160a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:52 crc kubenswrapper[4803]: I0320 17:37:52.153757 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b56a359b-06fc-40e2-b128-c2427461160a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:52 crc kubenswrapper[4803]: I0320 17:37:52.168891 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9461d82d-3e47-4dea-889c-a82c7f4a97b4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:52 crc kubenswrapper[4803]: I0320 17:37:52.168894 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9461d82d-3e47-4dea-889c-a82c7f4a97b4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.213:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 17:37:59 crc kubenswrapper[4803]: I0320 17:37:59.142072 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:37:59 crc kubenswrapper[4803]: I0320 17:37:59.142991 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 17:37:59 crc kubenswrapper[4803]: I0320 17:37:59.149666 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:37:59 crc kubenswrapper[4803]: I0320 17:37:59.149751 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.171829 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567138-qz74x"] Mar 20 17:38:00 crc 
kubenswrapper[4803]: I0320 17:38:00.173248 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-qz74x" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.180449 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.180748 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.181272 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.184871 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-qz74x"] Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.265053 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhrgq\" (UniqueName: \"kubernetes.io/projected/d9c6fd02-f9d2-4a83-8042-5bbf9c583f51-kube-api-access-lhrgq\") pod \"auto-csr-approver-29567138-qz74x\" (UID: \"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51\") " pod="openshift-infra/auto-csr-approver-29567138-qz74x" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.366949 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhrgq\" (UniqueName: \"kubernetes.io/projected/d9c6fd02-f9d2-4a83-8042-5bbf9c583f51-kube-api-access-lhrgq\") pod \"auto-csr-approver-29567138-qz74x\" (UID: \"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51\") " pod="openshift-infra/auto-csr-approver-29567138-qz74x" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.397957 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhrgq\" (UniqueName: \"kubernetes.io/projected/d9c6fd02-f9d2-4a83-8042-5bbf9c583f51-kube-api-access-lhrgq\") pod 
\"auto-csr-approver-29567138-qz74x\" (UID: \"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51\") " pod="openshift-infra/auto-csr-approver-29567138-qz74x" Mar 20 17:38:00 crc kubenswrapper[4803]: I0320 17:38:00.543050 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-qz74x" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.013026 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-qz74x"] Mar 20 17:38:01 crc kubenswrapper[4803]: W0320 17:38:01.019968 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9c6fd02_f9d2_4a83_8042_5bbf9c583f51.slice/crio-f1699e921e478e850657e42dae98dc0ab3ae0fe9601752ec965c5a308a5a4fba WatchSource:0}: Error finding container f1699e921e478e850657e42dae98dc0ab3ae0fe9601752ec965c5a308a5a4fba: Status 404 returned error can't find the container with id f1699e921e478e850657e42dae98dc0ab3ae0fe9601752ec965c5a308a5a4fba Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.148299 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.154803 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.154943 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.156003 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.156779 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.162672 4803 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.324761 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-qz74x" event={"ID":"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51","Type":"ContainerStarted","Data":"f1699e921e478e850657e42dae98dc0ab3ae0fe9601752ec965c5a308a5a4fba"} Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.330303 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 17:38:01 crc kubenswrapper[4803]: I0320 17:38:01.335378 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 17:38:05 crc kubenswrapper[4803]: I0320 17:38:05.409395 4803 generic.go:334] "Generic (PLEG): container finished" podID="d9c6fd02-f9d2-4a83-8042-5bbf9c583f51" containerID="e453ee4325cb778a15e6f9d4625eb9c198bef9e3322946ecea40b0215628f412" exitCode=0 Mar 20 17:38:05 crc kubenswrapper[4803]: I0320 17:38:05.409516 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-qz74x" event={"ID":"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51","Type":"ContainerDied","Data":"e453ee4325cb778a15e6f9d4625eb9c198bef9e3322946ecea40b0215628f412"} Mar 20 17:38:06 crc kubenswrapper[4803]: I0320 17:38:06.888769 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-qz74x" Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.068137 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhrgq\" (UniqueName: \"kubernetes.io/projected/d9c6fd02-f9d2-4a83-8042-5bbf9c583f51-kube-api-access-lhrgq\") pod \"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51\" (UID: \"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51\") " Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.076843 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c6fd02-f9d2-4a83-8042-5bbf9c583f51-kube-api-access-lhrgq" (OuterVolumeSpecName: "kube-api-access-lhrgq") pod "d9c6fd02-f9d2-4a83-8042-5bbf9c583f51" (UID: "d9c6fd02-f9d2-4a83-8042-5bbf9c583f51"). InnerVolumeSpecName "kube-api-access-lhrgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.170083 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhrgq\" (UniqueName: \"kubernetes.io/projected/d9c6fd02-f9d2-4a83-8042-5bbf9c583f51-kube-api-access-lhrgq\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.442403 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567138-qz74x" event={"ID":"d9c6fd02-f9d2-4a83-8042-5bbf9c583f51","Type":"ContainerDied","Data":"f1699e921e478e850657e42dae98dc0ab3ae0fe9601752ec965c5a308a5a4fba"} Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.442885 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1699e921e478e850657e42dae98dc0ab3ae0fe9601752ec965c5a308a5a4fba" Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.442463 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567138-qz74x" Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.967264 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-9dbds"] Mar 20 17:38:07 crc kubenswrapper[4803]: I0320 17:38:07.978258 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567132-9dbds"] Mar 20 17:38:08 crc kubenswrapper[4803]: I0320 17:38:08.861696 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caee5b1b-0dda-4525-bf7c-ac589fb5f730" path="/var/lib/kubelet/pods/caee5b1b-0dda-4525-bf7c-ac589fb5f730/volumes" Mar 20 17:38:10 crc kubenswrapper[4803]: I0320 17:38:10.002512 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:38:11 crc kubenswrapper[4803]: I0320 17:38:11.497921 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:38:13 crc kubenswrapper[4803]: I0320 17:38:13.044000 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="rabbitmq" containerID="cri-o://26089048cbbc614440decf4c08aa5e302585d08f70067e4479dbdf5e2efb25cb" gracePeriod=57 Mar 20 17:38:14 crc kubenswrapper[4803]: I0320 17:38:14.186302 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerName="rabbitmq" containerID="cri-o://3a6c86e5744baf1b54c823d9ba18a6394488ed3c40a8de67d330499157e57291" gracePeriod=58 Mar 20 17:38:15 crc kubenswrapper[4803]: I0320 17:38:15.526592 4803 generic.go:334] "Generic (PLEG): container finished" podID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerID="3a6c86e5744baf1b54c823d9ba18a6394488ed3c40a8de67d330499157e57291" exitCode=0 Mar 20 17:38:15 crc kubenswrapper[4803]: I0320 
17:38:15.526693 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b4b88aa-f18b-40c9-b8ad-dbd3739565da","Type":"ContainerDied","Data":"3a6c86e5744baf1b54c823d9ba18a6394488ed3c40a8de67d330499157e57291"} Mar 20 17:38:15 crc kubenswrapper[4803]: I0320 17:38:15.946153 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142551 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-config-data\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142627 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142690 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-plugins\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142720 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-pod-info\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142755 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9w4d\" 
(UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-kube-api-access-s9w4d\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142814 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-erlang-cookie\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142844 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-confd\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.142966 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-plugins-conf\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.143015 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-erlang-cookie-secret\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.143068 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-server-conf\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 
crc kubenswrapper[4803]: I0320 17:38:16.143129 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-tls\") pod \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\" (UID: \"5b4b88aa-f18b-40c9-b8ad-dbd3739565da\") " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.143514 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.143745 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.145089 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.146687 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.151293 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.151332 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-kube-api-access-s9w4d" (OuterVolumeSpecName: "kube-api-access-s9w4d") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "kube-api-access-s9w4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.151637 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.159742 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.173427 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-pod-info" (OuterVolumeSpecName: "pod-info") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.189444 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-config-data" (OuterVolumeSpecName: "config-data") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.205694 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-server-conf" (OuterVolumeSpecName: "server-conf") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245380 4803 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245419 4803 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245438 4803 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245452 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245467 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245500 4803 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245516 4803 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245593 4803 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-s9w4d\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-kube-api-access-s9w4d\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.245612 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.273700 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "5b4b88aa-f18b-40c9-b8ad-dbd3739565da" (UID: "5b4b88aa-f18b-40c9-b8ad-dbd3739565da"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.275896 4803 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.346954 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/5b4b88aa-f18b-40c9-b8ad-dbd3739565da-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.346981 4803 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.536052 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"5b4b88aa-f18b-40c9-b8ad-dbd3739565da","Type":"ContainerDied","Data":"a9f8181a3c133fcf816ab8ce69e6138c45c81660b198e72764725377ab38fc99"} Mar 20 17:38:16 crc kubenswrapper[4803]: 
I0320 17:38:16.536304 4803 scope.go:117] "RemoveContainer" containerID="3a6c86e5744baf1b54c823d9ba18a6394488ed3c40a8de67d330499157e57291" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.536152 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.579684 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.579879 4803 scope.go:117] "RemoveContainer" containerID="cb59d50529a5a90c57fcd7f2353539848ddae2913f0390264bfdfe1eccbb70f0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.614847 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.651948 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:38:16 crc kubenswrapper[4803]: E0320 17:38:16.652395 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerName="setup-container" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.652408 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerName="setup-container" Mar 20 17:38:16 crc kubenswrapper[4803]: E0320 17:38:16.652422 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerName="rabbitmq" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.652428 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerName="rabbitmq" Mar 20 17:38:16 crc kubenswrapper[4803]: E0320 17:38:16.652442 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c6fd02-f9d2-4a83-8042-5bbf9c583f51" containerName="oc" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 
17:38:16.652448 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c6fd02-f9d2-4a83-8042-5bbf9c583f51" containerName="oc" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.652734 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" containerName="rabbitmq" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.652759 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c6fd02-f9d2-4a83-8042-5bbf9c583f51" containerName="oc" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.654138 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.656016 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.656702 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.657112 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.657232 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-4kqgt" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.657344 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.657490 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.657592 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.666643 4803 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.755383 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.755452 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.755472 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6450d307-a4cf-4d3c-acdd-31a50aec6109-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.755819 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.755935 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.756059 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6450d307-a4cf-4d3c-acdd-31a50aec6109-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.756195 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.756275 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.756371 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.756463 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") 
" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.756597 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z49nl\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-kube-api-access-z49nl\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857564 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857618 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857637 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6450d307-a4cf-4d3c-acdd-31a50aec6109-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857665 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857684 
4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857714 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6450d307-a4cf-4d3c-acdd-31a50aec6109-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857739 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857763 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857787 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857810 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.857839 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z49nl\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-kube-api-access-z49nl\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.858215 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.858619 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.858808 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.858952 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.859058 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.859277 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6450d307-a4cf-4d3c-acdd-31a50aec6109-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.862023 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6450d307-a4cf-4d3c-acdd-31a50aec6109-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.862310 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b4b88aa-f18b-40c9-b8ad-dbd3739565da" path="/var/lib/kubelet/pods/5b4b88aa-f18b-40c9-b8ad-dbd3739565da/volumes" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.863468 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.864037 4803 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.871025 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6450d307-a4cf-4d3c-acdd-31a50aec6109-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.874617 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z49nl\" (UniqueName: \"kubernetes.io/projected/6450d307-a4cf-4d3c-acdd-31a50aec6109-kube-api-access-z49nl\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.901899 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6450d307-a4cf-4d3c-acdd-31a50aec6109\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:16 crc kubenswrapper[4803]: I0320 17:38:16.972117 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:38:17 crc kubenswrapper[4803]: I0320 17:38:17.481711 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 17:38:17 crc kubenswrapper[4803]: I0320 17:38:17.547683 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6450d307-a4cf-4d3c-acdd-31a50aec6109","Type":"ContainerStarted","Data":"8c8828c6119bdb855e471b645ca3cbcbf320bf5e03750cd6bafb6dcba45f1742"} Mar 20 17:38:18 crc kubenswrapper[4803]: I0320 17:38:18.430449 4803 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.579143 4803 generic.go:334] "Generic (PLEG): container finished" podID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerID="26089048cbbc614440decf4c08aa5e302585d08f70067e4479dbdf5e2efb25cb" exitCode=0 Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.579647 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b537f5f-54f2-4c70-be7b-4c57f84c572c","Type":"ContainerDied","Data":"26089048cbbc614440decf4c08aa5e302585d08f70067e4479dbdf5e2efb25cb"} Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.581229 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6450d307-a4cf-4d3c-acdd-31a50aec6109","Type":"ContainerStarted","Data":"108e4f9536d078a4342cc5e6ffb9a05a7f5d377b009d5d7e7806e30b9268b1d4"} Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.804485 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922032 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922075 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b537f5f-54f2-4c70-be7b-4c57f84c572c-pod-info\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922119 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-plugins\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922138 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b537f5f-54f2-4c70-be7b-4c57f84c572c-erlang-cookie-secret\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922160 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-config-data\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922238 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxgml\" (UniqueName: 
\"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-kube-api-access-bxgml\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922286 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-confd\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922387 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-plugins-conf\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922405 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-erlang-cookie\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922450 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-tls\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.922464 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-server-conf\") pod \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\" (UID: \"4b537f5f-54f2-4c70-be7b-4c57f84c572c\") " Mar 20 17:38:19 crc kubenswrapper[4803]: 
I0320 17:38:19.924269 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.924726 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.927247 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.927457 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.930800 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4b537f5f-54f2-4c70-be7b-4c57f84c572c-pod-info" (OuterVolumeSpecName: "pod-info") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.930946 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.934143 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b537f5f-54f2-4c70-be7b-4c57f84c572c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.935187 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-kube-api-access-bxgml" (OuterVolumeSpecName: "kube-api-access-bxgml") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "kube-api-access-bxgml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.961146 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-server-conf" (OuterVolumeSpecName: "server-conf") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:19 crc kubenswrapper[4803]: I0320 17:38:19.969186 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-config-data" (OuterVolumeSpecName: "config-data") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.024333 4803 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025471 4803 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b537f5f-54f2-4c70-be7b-4c57f84c572c-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025564 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025619 4803 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b537f5f-54f2-4c70-be7b-4c57f84c572c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 
crc kubenswrapper[4803]: I0320 17:38:20.025678 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025740 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxgml\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-kube-api-access-bxgml\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025792 4803 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025853 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025921 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.025971 4803 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b537f5f-54f2-4c70-be7b-4c57f84c572c-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.040189 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4b537f5f-54f2-4c70-be7b-4c57f84c572c" (UID: "4b537f5f-54f2-4c70-be7b-4c57f84c572c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.045538 4803 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.127673 4803 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b537f5f-54f2-4c70-be7b-4c57f84c572c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.127718 4803 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.168597 4803 scope.go:117] "RemoveContainer" containerID="56b5f2c3a0afdb80fd2c803f9cb50e733f01d14610e6d0680e39257648da9d04" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.242642 4803 scope.go:117] "RemoveContainer" containerID="0b2767e577cb938fc13be147b170a48f9578281d7709f1398cef293ec32e2198" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.275960 4803 scope.go:117] "RemoveContainer" containerID="e6f39a5c444cbca9a1c7b39a853c9b42162eda29c5ea9af6b63ba72cea71f6a9" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.302385 4803 scope.go:117] "RemoveContainer" containerID="c530a880676c11cd3a8e68b60a1e97720c979aa9175dd420fdb7b46224ab8f4b" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.593875 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b537f5f-54f2-4c70-be7b-4c57f84c572c","Type":"ContainerDied","Data":"74d965a7cf07649bab222faee00aedfa14447bb2ced0ee5316e7ff0dfb875c3d"} Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.593953 4803 scope.go:117] "RemoveContainer" containerID="26089048cbbc614440decf4c08aa5e302585d08f70067e4479dbdf5e2efb25cb" Mar 
20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.594702 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.636043 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.646048 4803 scope.go:117] "RemoveContainer" containerID="052d5539bc83f751f8ac5bfbf90e0df4d4248a9132e6e3ff8ad18e6caaa138b0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.649205 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.665988 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:38:20 crc kubenswrapper[4803]: E0320 17:38:20.666440 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="setup-container" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.666462 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="setup-container" Mar 20 17:38:20 crc kubenswrapper[4803]: E0320 17:38:20.666485 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="rabbitmq" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.666494 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="rabbitmq" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.666864 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" containerName="rabbitmq" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.668069 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.671898 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.672114 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.672163 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-rc669" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.672420 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.672427 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.672577 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.672710 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.693560 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840200 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840545 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45lxv\" (UniqueName: 
\"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-kube-api-access-45lxv\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840583 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72227685-667c-47b8-aedb-0329dd683bc0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840626 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840710 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840734 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840751 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840785 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-config-data\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840868 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840884 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.840905 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72227685-667c-47b8-aedb-0329dd683bc0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.861712 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b537f5f-54f2-4c70-be7b-4c57f84c572c" path="/var/lib/kubelet/pods/4b537f5f-54f2-4c70-be7b-4c57f84c572c/volumes" Mar 20 17:38:20 
crc kubenswrapper[4803]: I0320 17:38:20.942264 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72227685-667c-47b8-aedb-0329dd683bc0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.942358 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.942420 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45lxv\" (UniqueName: \"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-kube-api-access-45lxv\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.942465 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72227685-667c-47b8-aedb-0329dd683bc0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.942602 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.942681 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.942699 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.942972 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.943006 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.943092 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-config-data\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.943285 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.943306 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.943709 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.944289 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.944291 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.944486 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-config-data\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.944500 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72227685-667c-47b8-aedb-0329dd683bc0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.948240 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72227685-667c-47b8-aedb-0329dd683bc0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.948351 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72227685-667c-47b8-aedb-0329dd683bc0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.948586 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.953693 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.963481 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45lxv\" (UniqueName: \"kubernetes.io/projected/72227685-667c-47b8-aedb-0329dd683bc0-kube-api-access-45lxv\") pod \"rabbitmq-server-0\" (UID: 
\"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:20 crc kubenswrapper[4803]: I0320 17:38:20.995239 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"72227685-667c-47b8-aedb-0329dd683bc0\") " pod="openstack/rabbitmq-server-0" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.047572 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.459500 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-b6xn4"] Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.465768 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.467588 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.482188 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-b6xn4"] Mar 20 17:38:21 crc kubenswrapper[4803]: W0320 17:38:21.540284 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72227685_667c_47b8_aedb_0329dd683bc0.slice/crio-9c1af26dbabcd2fdccaeecf270a74d4d02d8b505545b6e338008d1f6ec0d9a70 WatchSource:0}: Error finding container 9c1af26dbabcd2fdccaeecf270a74d4d02d8b505545b6e338008d1f6ec0d9a70: Status 404 returned error can't find the container with id 9c1af26dbabcd2fdccaeecf270a74d4d02d8b505545b6e338008d1f6ec0d9a70 Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.549050 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 
17:38:21.621035 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72227685-667c-47b8-aedb-0329dd683bc0","Type":"ContainerStarted","Data":"9c1af26dbabcd2fdccaeecf270a74d4d02d8b505545b6e338008d1f6ec0d9a70"} Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.656920 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.657017 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq4bh\" (UniqueName: \"kubernetes.io/projected/73307826-c6c4-42c6-94f6-ff05f23cacfe-kube-api-access-mq4bh\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.657059 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.657923 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.658084 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-config\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.658140 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.658210 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-svc\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.693417 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-b6xn4"] Mar 20 17:38:21 crc kubenswrapper[4803]: E0320 17:38:21.694329 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-mq4bh openstack-edpm-ipam ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-d558885bc-b6xn4" podUID="73307826-c6c4-42c6-94f6-ff05f23cacfe" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.714891 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-hh5vh"] Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.716410 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.734641 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-hh5vh"] Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.760088 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.760169 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq4bh\" (UniqueName: \"kubernetes.io/projected/73307826-c6c4-42c6-94f6-ff05f23cacfe-kube-api-access-mq4bh\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.760200 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.760230 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.760279 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-config\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.760304 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.760333 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-svc\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.761083 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-svc\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.761934 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.762698 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-config\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: 
\"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.762839 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.762908 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.763407 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.793676 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq4bh\" (UniqueName: \"kubernetes.io/projected/73307826-c6c4-42c6-94f6-ff05f23cacfe-kube-api-access-mq4bh\") pod \"dnsmasq-dns-d558885bc-b6xn4\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.863205 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: 
\"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.863285 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.863435 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-config\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.863503 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.863618 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.863758 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7bpw\" (UniqueName: \"kubernetes.io/projected/6516522c-430f-476f-8471-d5b39263571f-kube-api-access-g7bpw\") pod 
\"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.863871 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.965372 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.965446 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.965506 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-config\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.965561 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-ovsdbserver-sb\") pod 
\"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.965614 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.965645 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7bpw\" (UniqueName: \"kubernetes.io/projected/6516522c-430f-476f-8471-d5b39263571f-kube-api-access-g7bpw\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.965680 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.966367 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.966480 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: 
\"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.966569 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.966575 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-config\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.966981 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.967031 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6516522c-430f-476f-8471-d5b39263571f-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:21 crc kubenswrapper[4803]: I0320 17:38:21.988353 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7bpw\" (UniqueName: \"kubernetes.io/projected/6516522c-430f-476f-8471-d5b39263571f-kube-api-access-g7bpw\") pod \"dnsmasq-dns-78c64bc9c5-hh5vh\" (UID: \"6516522c-430f-476f-8471-d5b39263571f\") " pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 
17:38:22 crc kubenswrapper[4803]: I0320 17:38:22.037458 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:22 crc kubenswrapper[4803]: I0320 17:38:22.548617 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-hh5vh"] Mar 20 17:38:22 crc kubenswrapper[4803]: W0320 17:38:22.578504 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6516522c_430f_476f_8471_d5b39263571f.slice/crio-026ad1be3e6f942809dae267110fcf2a64a359a880ecd3a8f639818a4e9f3ac8 WatchSource:0}: Error finding container 026ad1be3e6f942809dae267110fcf2a64a359a880ecd3a8f639818a4e9f3ac8: Status 404 returned error can't find the container with id 026ad1be3e6f942809dae267110fcf2a64a359a880ecd3a8f639818a4e9f3ac8 Mar 20 17:38:22 crc kubenswrapper[4803]: I0320 17:38:22.630779 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" event={"ID":"6516522c-430f-476f-8471-d5b39263571f","Type":"ContainerStarted","Data":"026ad1be3e6f942809dae267110fcf2a64a359a880ecd3a8f639818a4e9f3ac8"} Mar 20 17:38:22 crc kubenswrapper[4803]: I0320 17:38:22.630814 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.127041 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.291079 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-openstack-edpm-ipam\") pod \"73307826-c6c4-42c6-94f6-ff05f23cacfe\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.291483 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-swift-storage-0\") pod \"73307826-c6c4-42c6-94f6-ff05f23cacfe\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.291652 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "73307826-c6c4-42c6-94f6-ff05f23cacfe" (UID: "73307826-c6c4-42c6-94f6-ff05f23cacfe"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.291719 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-svc\") pod \"73307826-c6c4-42c6-94f6-ff05f23cacfe\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.291824 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq4bh\" (UniqueName: \"kubernetes.io/projected/73307826-c6c4-42c6-94f6-ff05f23cacfe-kube-api-access-mq4bh\") pod \"73307826-c6c4-42c6-94f6-ff05f23cacfe\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.291875 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "73307826-c6c4-42c6-94f6-ff05f23cacfe" (UID: "73307826-c6c4-42c6-94f6-ff05f23cacfe"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.291975 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-nb\") pod \"73307826-c6c4-42c6-94f6-ff05f23cacfe\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.292001 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-config\") pod \"73307826-c6c4-42c6-94f6-ff05f23cacfe\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.292083 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-sb\") pod \"73307826-c6c4-42c6-94f6-ff05f23cacfe\" (UID: \"73307826-c6c4-42c6-94f6-ff05f23cacfe\") " Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.292670 4803 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.292692 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.292090 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "73307826-c6c4-42c6-94f6-ff05f23cacfe" (UID: "73307826-c6c4-42c6-94f6-ff05f23cacfe"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.292286 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "73307826-c6c4-42c6-94f6-ff05f23cacfe" (UID: "73307826-c6c4-42c6-94f6-ff05f23cacfe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.293060 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "73307826-c6c4-42c6-94f6-ff05f23cacfe" (UID: "73307826-c6c4-42c6-94f6-ff05f23cacfe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.293158 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-config" (OuterVolumeSpecName: "config") pod "73307826-c6c4-42c6-94f6-ff05f23cacfe" (UID: "73307826-c6c4-42c6-94f6-ff05f23cacfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.298687 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73307826-c6c4-42c6-94f6-ff05f23cacfe-kube-api-access-mq4bh" (OuterVolumeSpecName: "kube-api-access-mq4bh") pod "73307826-c6c4-42c6-94f6-ff05f23cacfe" (UID: "73307826-c6c4-42c6-94f6-ff05f23cacfe"). InnerVolumeSpecName "kube-api-access-mq4bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.393551 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.393582 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq4bh\" (UniqueName: \"kubernetes.io/projected/73307826-c6c4-42c6-94f6-ff05f23cacfe-kube-api-access-mq4bh\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.393592 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.393601 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.393609 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/73307826-c6c4-42c6-94f6-ff05f23cacfe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.642026 4803 generic.go:334] "Generic (PLEG): container finished" podID="6516522c-430f-476f-8471-d5b39263571f" containerID="c57e70aa19a0840eef5ab92a1e3f80d00b1ca12f8d10e91708fad91ee1f4ecbd" exitCode=0 Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.642126 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" event={"ID":"6516522c-430f-476f-8471-d5b39263571f","Type":"ContainerDied","Data":"c57e70aa19a0840eef5ab92a1e3f80d00b1ca12f8d10e91708fad91ee1f4ecbd"} Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 
17:38:23.649869 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-b6xn4" Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.650031 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72227685-667c-47b8-aedb-0329dd683bc0","Type":"ContainerStarted","Data":"59277e2bb3ec371e619f0be24e3ed0143aa44bd0c8847d392aa91b9ff9de790b"} Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.877425 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-b6xn4"] Mar 20 17:38:23 crc kubenswrapper[4803]: I0320 17:38:23.882328 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-b6xn4"] Mar 20 17:38:24 crc kubenswrapper[4803]: I0320 17:38:24.663689 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" event={"ID":"6516522c-430f-476f-8471-d5b39263571f","Type":"ContainerStarted","Data":"8b515174316f1a72c2c413c512cdf1fb7f11ab60f7bfbc11e3180b81fbdf450b"} Mar 20 17:38:24 crc kubenswrapper[4803]: I0320 17:38:24.696895 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" podStartSLOduration=3.696864241 podStartE2EDuration="3.696864241s" podCreationTimestamp="2026-03-20 17:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:38:24.688954476 +0000 UTC m=+1314.600546646" watchObservedRunningTime="2026-03-20 17:38:24.696864241 +0000 UTC m=+1314.608456341" Mar 20 17:38:24 crc kubenswrapper[4803]: I0320 17:38:24.862189 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73307826-c6c4-42c6-94f6-ff05f23cacfe" path="/var/lib/kubelet/pods/73307826-c6c4-42c6-94f6-ff05f23cacfe/volumes" Mar 20 17:38:25 crc kubenswrapper[4803]: I0320 17:38:25.676921 4803 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:31 crc kubenswrapper[4803]: I0320 17:38:31.859273 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2dz8"] Mar 20 17:38:31 crc kubenswrapper[4803]: I0320 17:38:31.861699 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:31 crc kubenswrapper[4803]: I0320 17:38:31.880572 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2dz8"] Mar 20 17:38:31 crc kubenswrapper[4803]: I0320 17:38:31.974643 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lw8f\" (UniqueName: \"kubernetes.io/projected/222ebbc2-d868-492d-9922-89a1b009e676-kube-api-access-9lw8f\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:31 crc kubenswrapper[4803]: I0320 17:38:31.974770 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-utilities\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:31 crc kubenswrapper[4803]: I0320 17:38:31.974842 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-catalog-content\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.039674 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-78c64bc9c5-hh5vh" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.077315 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lw8f\" (UniqueName: \"kubernetes.io/projected/222ebbc2-d868-492d-9922-89a1b009e676-kube-api-access-9lw8f\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.077449 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-utilities\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.077567 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-catalog-content\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.078021 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-utilities\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.078144 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-catalog-content\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc 
kubenswrapper[4803]: I0320 17:38:32.120456 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"] Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.121002 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" podUID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerName="dnsmasq-dns" containerID="cri-o://246a4b0f715b2b9961f880a5b5939f4b4026cb26304f6dc56601a263a444b0a3" gracePeriod=10 Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.132488 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lw8f\" (UniqueName: \"kubernetes.io/projected/222ebbc2-d868-492d-9922-89a1b009e676-kube-api-access-9lw8f\") pod \"redhat-operators-c2dz8\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.228072 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.749402 4803 generic.go:334] "Generic (PLEG): container finished" podID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerID="246a4b0f715b2b9961f880a5b5939f4b4026cb26304f6dc56601a263a444b0a3" exitCode=0 Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.749633 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" event={"ID":"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0","Type":"ContainerDied","Data":"246a4b0f715b2b9961f880a5b5939f4b4026cb26304f6dc56601a263a444b0a3"} Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.749741 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" event={"ID":"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0","Type":"ContainerDied","Data":"c1393bac5be2b0f622a4ba716ffc58c78a440dbd0bdac02e80e5f7f446566d36"} Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.749757 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1393bac5be2b0f622a4ba716ffc58c78a440dbd0bdac02e80e5f7f446566d36" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.781017 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2dz8"] Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.856271 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.992917 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-swift-storage-0\") pod \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.993005 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-config\") pod \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.993048 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psbfw\" (UniqueName: \"kubernetes.io/projected/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-kube-api-access-psbfw\") pod \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.993143 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-sb\") pod \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.993220 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-svc\") pod \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " Mar 20 17:38:32 crc kubenswrapper[4803]: I0320 17:38:32.993247 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-nb\") pod \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\" (UID: \"1a290344-3b36-4e1c-a94e-f9c31b7b8fe0\") " Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:32.999184 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-kube-api-access-psbfw" (OuterVolumeSpecName: "kube-api-access-psbfw") pod "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" (UID: "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0"). InnerVolumeSpecName "kube-api-access-psbfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.041635 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-config" (OuterVolumeSpecName: "config") pod "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" (UID: "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.043211 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" (UID: "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.057299 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" (UID: "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.060084 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" (UID: "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.078211 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" (UID: "1a290344-3b36-4e1c-a94e-f9c31b7b8fe0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.095274 4803 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-config\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.095313 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psbfw\" (UniqueName: \"kubernetes.io/projected/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-kube-api-access-psbfw\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.095323 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.095332 4803 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:33 crc 
kubenswrapper[4803]: I0320 17:38:33.095343 4803 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.095354 4803 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.757830 4803 generic.go:334] "Generic (PLEG): container finished" podID="222ebbc2-d868-492d-9922-89a1b009e676" containerID="43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938" exitCode=0 Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.758125 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mxk8r" Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.758603 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2dz8" event={"ID":"222ebbc2-d868-492d-9922-89a1b009e676","Type":"ContainerDied","Data":"43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938"} Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.758663 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2dz8" event={"ID":"222ebbc2-d868-492d-9922-89a1b009e676","Type":"ContainerStarted","Data":"46017309543af4c0d634a67209ad9e18188be60c19ae0bbc6f813bd3fa3f1a62"} Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.805389 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"] Mar 20 17:38:33 crc kubenswrapper[4803]: I0320 17:38:33.813908 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mxk8r"] Mar 20 17:38:34 crc kubenswrapper[4803]: I0320 17:38:34.865196 
4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" path="/var/lib/kubelet/pods/1a290344-3b36-4e1c-a94e-f9c31b7b8fe0/volumes" Mar 20 17:38:36 crc kubenswrapper[4803]: I0320 17:38:36.788853 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2dz8" event={"ID":"222ebbc2-d868-492d-9922-89a1b009e676","Type":"ContainerStarted","Data":"e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294"} Mar 20 17:38:37 crc kubenswrapper[4803]: I0320 17:38:37.832264 4803 generic.go:334] "Generic (PLEG): container finished" podID="222ebbc2-d868-492d-9922-89a1b009e676" containerID="e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294" exitCode=0 Mar 20 17:38:37 crc kubenswrapper[4803]: I0320 17:38:37.832368 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2dz8" event={"ID":"222ebbc2-d868-492d-9922-89a1b009e676","Type":"ContainerDied","Data":"e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294"} Mar 20 17:38:38 crc kubenswrapper[4803]: I0320 17:38:38.246253 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:38:38 crc kubenswrapper[4803]: I0320 17:38:38.246322 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.720183 4803 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5"] Mar 20 17:38:40 crc kubenswrapper[4803]: E0320 17:38:40.721104 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerName="init" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.721123 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerName="init" Mar 20 17:38:40 crc kubenswrapper[4803]: E0320 17:38:40.721191 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerName="dnsmasq-dns" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.721206 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerName="dnsmasq-dns" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.721507 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a290344-3b36-4e1c-a94e-f9c31b7b8fe0" containerName="dnsmasq-dns" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.722423 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.727487 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.728063 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.728653 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.729088 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.731905 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5"] Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.786409 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.786712 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.786844 4803 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.786959 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt88h\" (UniqueName: \"kubernetes.io/projected/be466e4c-5034-4784-9c39-a390b28adb2e-kube-api-access-wt88h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.888446 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt88h\" (UniqueName: \"kubernetes.io/projected/be466e4c-5034-4784-9c39-a390b28adb2e-kube-api-access-wt88h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.888949 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.888998 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.889022 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.894494 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.895145 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.897120 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:40 crc kubenswrapper[4803]: I0320 17:38:40.903174 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt88h\" (UniqueName: \"kubernetes.io/projected/be466e4c-5034-4784-9c39-a390b28adb2e-kube-api-access-wt88h\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:41 crc kubenswrapper[4803]: I0320 17:38:41.093921 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:38:42 crc kubenswrapper[4803]: W0320 17:38:42.007042 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe466e4c_5034_4784_9c39_a390b28adb2e.slice/crio-b70c3a9b8d522135120d21d9d6380807458b3a55cc94f0dd21077f97e47f9a76 WatchSource:0}: Error finding container b70c3a9b8d522135120d21d9d6380807458b3a55cc94f0dd21077f97e47f9a76: Status 404 returned error can't find the container with id b70c3a9b8d522135120d21d9d6380807458b3a55cc94f0dd21077f97e47f9a76 Mar 20 17:38:42 crc kubenswrapper[4803]: I0320 17:38:42.009054 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5"] Mar 20 17:38:42 crc kubenswrapper[4803]: I0320 17:38:42.915511 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" event={"ID":"be466e4c-5034-4784-9c39-a390b28adb2e","Type":"ContainerStarted","Data":"b70c3a9b8d522135120d21d9d6380807458b3a55cc94f0dd21077f97e47f9a76"} Mar 20 17:38:45 crc kubenswrapper[4803]: I0320 17:38:45.946716 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2dz8" 
event={"ID":"222ebbc2-d868-492d-9922-89a1b009e676","Type":"ContainerStarted","Data":"1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c"} Mar 20 17:38:45 crc kubenswrapper[4803]: I0320 17:38:45.971731 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2dz8" podStartSLOduration=3.887583785 podStartE2EDuration="14.971704258s" podCreationTimestamp="2026-03-20 17:38:31 +0000 UTC" firstStartedPulling="2026-03-20 17:38:33.759836348 +0000 UTC m=+1323.671428408" lastFinishedPulling="2026-03-20 17:38:44.843956801 +0000 UTC m=+1334.755548881" observedRunningTime="2026-03-20 17:38:45.965259585 +0000 UTC m=+1335.876851675" watchObservedRunningTime="2026-03-20 17:38:45.971704258 +0000 UTC m=+1335.883296328" Mar 20 17:38:51 crc kubenswrapper[4803]: I0320 17:38:51.077676 4803 generic.go:334] "Generic (PLEG): container finished" podID="6450d307-a4cf-4d3c-acdd-31a50aec6109" containerID="108e4f9536d078a4342cc5e6ffb9a05a7f5d377b009d5d7e7806e30b9268b1d4" exitCode=0 Mar 20 17:38:51 crc kubenswrapper[4803]: I0320 17:38:51.077895 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6450d307-a4cf-4d3c-acdd-31a50aec6109","Type":"ContainerDied","Data":"108e4f9536d078a4342cc5e6ffb9a05a7f5d377b009d5d7e7806e30b9268b1d4"} Mar 20 17:38:52 crc kubenswrapper[4803]: I0320 17:38:52.228898 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:52 crc kubenswrapper[4803]: I0320 17:38:52.229217 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:38:53 crc kubenswrapper[4803]: I0320 17:38:53.303176 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2dz8" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="registry-server" probeResult="failure" output=< 
Mar 20 17:38:53 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 17:38:53 crc kubenswrapper[4803]: > Mar 20 17:38:55 crc kubenswrapper[4803]: I0320 17:38:55.114726 4803 generic.go:334] "Generic (PLEG): container finished" podID="72227685-667c-47b8-aedb-0329dd683bc0" containerID="59277e2bb3ec371e619f0be24e3ed0143aa44bd0c8847d392aa91b9ff9de790b" exitCode=0 Mar 20 17:38:55 crc kubenswrapper[4803]: I0320 17:38:55.114791 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72227685-667c-47b8-aedb-0329dd683bc0","Type":"ContainerDied","Data":"59277e2bb3ec371e619f0be24e3ed0143aa44bd0c8847d392aa91b9ff9de790b"} Mar 20 17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.185775 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72227685-667c-47b8-aedb-0329dd683bc0","Type":"ContainerStarted","Data":"29e3acedbd828e23f1e3968bc2555156f2395292d67c89b809ccca55d0ac9679"} Mar 20 17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.186680 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.190570 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6450d307-a4cf-4d3c-acdd-31a50aec6109","Type":"ContainerStarted","Data":"1c8e0ba6f7daaf565a91d86c25ee66302f94451f9aa9950e659edc26fca19855"} Mar 20 17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.190880 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.192947 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" event={"ID":"be466e4c-5034-4784-9c39-a390b28adb2e","Type":"ContainerStarted","Data":"20fced93e3f00000df12d43f13acd9466a572a0dbd45a5afeeca3f45f00ce3d9"} Mar 20 
17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.215798 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.215782665 podStartE2EDuration="40.215782665s" podCreationTimestamp="2026-03-20 17:38:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:00.214444457 +0000 UTC m=+1350.126036537" watchObservedRunningTime="2026-03-20 17:39:00.215782665 +0000 UTC m=+1350.127374735" Mar 20 17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.252455 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" podStartSLOduration=3.002446809 podStartE2EDuration="20.252433756s" podCreationTimestamp="2026-03-20 17:38:40 +0000 UTC" firstStartedPulling="2026-03-20 17:38:42.009328147 +0000 UTC m=+1331.920920217" lastFinishedPulling="2026-03-20 17:38:59.259315094 +0000 UTC m=+1349.170907164" observedRunningTime="2026-03-20 17:39:00.238203652 +0000 UTC m=+1350.149795752" watchObservedRunningTime="2026-03-20 17:39:00.252433756 +0000 UTC m=+1350.164025826" Mar 20 17:39:00 crc kubenswrapper[4803]: I0320 17:39:00.268969 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.268951656 podStartE2EDuration="44.268951656s" podCreationTimestamp="2026-03-20 17:38:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 17:39:00.266563948 +0000 UTC m=+1350.178156038" watchObservedRunningTime="2026-03-20 17:39:00.268951656 +0000 UTC m=+1350.180543726" Mar 20 17:39:03 crc kubenswrapper[4803]: I0320 17:39:03.283966 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c2dz8" podUID="222ebbc2-d868-492d-9922-89a1b009e676" 
containerName="registry-server" probeResult="failure" output=< Mar 20 17:39:03 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 17:39:03 crc kubenswrapper[4803]: > Mar 20 17:39:08 crc kubenswrapper[4803]: I0320 17:39:08.246276 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:39:08 crc kubenswrapper[4803]: I0320 17:39:08.246809 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:39:10 crc kubenswrapper[4803]: I0320 17:39:10.283363 4803 generic.go:334] "Generic (PLEG): container finished" podID="be466e4c-5034-4784-9c39-a390b28adb2e" containerID="20fced93e3f00000df12d43f13acd9466a572a0dbd45a5afeeca3f45f00ce3d9" exitCode=0 Mar 20 17:39:10 crc kubenswrapper[4803]: I0320 17:39:10.283424 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" event={"ID":"be466e4c-5034-4784-9c39-a390b28adb2e","Type":"ContainerDied","Data":"20fced93e3f00000df12d43f13acd9466a572a0dbd45a5afeeca3f45f00ce3d9"} Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.051750 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.730817 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.876740 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-inventory\") pod \"be466e4c-5034-4784-9c39-a390b28adb2e\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.876833 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt88h\" (UniqueName: \"kubernetes.io/projected/be466e4c-5034-4784-9c39-a390b28adb2e-kube-api-access-wt88h\") pod \"be466e4c-5034-4784-9c39-a390b28adb2e\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.877067 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-repo-setup-combined-ca-bundle\") pod \"be466e4c-5034-4784-9c39-a390b28adb2e\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.877090 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-ssh-key-openstack-edpm-ipam\") pod \"be466e4c-5034-4784-9c39-a390b28adb2e\" (UID: \"be466e4c-5034-4784-9c39-a390b28adb2e\") " Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.882836 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be466e4c-5034-4784-9c39-a390b28adb2e-kube-api-access-wt88h" (OuterVolumeSpecName: "kube-api-access-wt88h") pod "be466e4c-5034-4784-9c39-a390b28adb2e" (UID: "be466e4c-5034-4784-9c39-a390b28adb2e"). InnerVolumeSpecName "kube-api-access-wt88h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.882942 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "be466e4c-5034-4784-9c39-a390b28adb2e" (UID: "be466e4c-5034-4784-9c39-a390b28adb2e"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.909669 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-inventory" (OuterVolumeSpecName: "inventory") pod "be466e4c-5034-4784-9c39-a390b28adb2e" (UID: "be466e4c-5034-4784-9c39-a390b28adb2e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.912464 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be466e4c-5034-4784-9c39-a390b28adb2e" (UID: "be466e4c-5034-4784-9c39-a390b28adb2e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.980323 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.980372 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt88h\" (UniqueName: \"kubernetes.io/projected/be466e4c-5034-4784-9c39-a390b28adb2e-kube-api-access-wt88h\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.980403 4803 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:11 crc kubenswrapper[4803]: I0320 17:39:11.980413 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be466e4c-5034-4784-9c39-a390b28adb2e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.314637 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" event={"ID":"be466e4c-5034-4784-9c39-a390b28adb2e","Type":"ContainerDied","Data":"b70c3a9b8d522135120d21d9d6380807458b3a55cc94f0dd21077f97e47f9a76"} Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.314680 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b70c3a9b8d522135120d21d9d6380807458b3a55cc94f0dd21077f97e47f9a76" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.314806 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.337085 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.417130 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.430723 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4"] Mar 20 17:39:12 crc kubenswrapper[4803]: E0320 17:39:12.431314 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be466e4c-5034-4784-9c39-a390b28adb2e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.431378 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="be466e4c-5034-4784-9c39-a390b28adb2e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.431625 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="be466e4c-5034-4784-9c39-a390b28adb2e" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.432267 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.437876 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.437896 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.438043 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.438377 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.446593 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4"] Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.504757 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86l5v\" (UniqueName: \"kubernetes.io/projected/5daa4111-ac59-4390-b425-d56ac571c768-kube-api-access-86l5v\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.505147 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.505942 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.606866 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.607226 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.607460 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86l5v\" (UniqueName: \"kubernetes.io/projected/5daa4111-ac59-4390-b425-d56ac571c768-kube-api-access-86l5v\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.611889 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.618039 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.630300 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86l5v\" (UniqueName: \"kubernetes.io/projected/5daa4111-ac59-4390-b425-d56ac571c768-kube-api-access-86l5v\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-qtns4\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:12 crc kubenswrapper[4803]: I0320 17:39:12.764864 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:13 crc kubenswrapper[4803]: I0320 17:39:13.167621 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2dz8"] Mar 20 17:39:13 crc kubenswrapper[4803]: I0320 17:39:13.457005 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4"] Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.334408 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" event={"ID":"5daa4111-ac59-4390-b425-d56ac571c768","Type":"ContainerStarted","Data":"74a2ce8a742aeb11dbcdb1bbf0fa6b9597f2f122aa65f53c0a676505db4a0aa3"} Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.335141 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" event={"ID":"5daa4111-ac59-4390-b425-d56ac571c768","Type":"ContainerStarted","Data":"ab910a731b8f2ac12785906835165ee2ff07308163fa64bcb66e326d46aac164"} Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.334687 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2dz8" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="registry-server" containerID="cri-o://1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c" gracePeriod=2 Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.367419 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" podStartSLOduration=1.844592896 podStartE2EDuration="2.367386087s" podCreationTimestamp="2026-03-20 17:39:12 +0000 UTC" firstStartedPulling="2026-03-20 17:39:13.462143591 +0000 UTC m=+1363.373735661" lastFinishedPulling="2026-03-20 17:39:13.984936772 +0000 UTC m=+1363.896528852" observedRunningTime="2026-03-20 
17:39:14.357658801 +0000 UTC m=+1364.269250891" watchObservedRunningTime="2026-03-20 17:39:14.367386087 +0000 UTC m=+1364.278978197" Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.829597 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.882895 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-catalog-content\") pod \"222ebbc2-d868-492d-9922-89a1b009e676\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.883076 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-utilities\") pod \"222ebbc2-d868-492d-9922-89a1b009e676\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.883210 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lw8f\" (UniqueName: \"kubernetes.io/projected/222ebbc2-d868-492d-9922-89a1b009e676-kube-api-access-9lw8f\") pod \"222ebbc2-d868-492d-9922-89a1b009e676\" (UID: \"222ebbc2-d868-492d-9922-89a1b009e676\") " Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.884848 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-utilities" (OuterVolumeSpecName: "utilities") pod "222ebbc2-d868-492d-9922-89a1b009e676" (UID: "222ebbc2-d868-492d-9922-89a1b009e676"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.893686 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/222ebbc2-d868-492d-9922-89a1b009e676-kube-api-access-9lw8f" (OuterVolumeSpecName: "kube-api-access-9lw8f") pod "222ebbc2-d868-492d-9922-89a1b009e676" (UID: "222ebbc2-d868-492d-9922-89a1b009e676"). InnerVolumeSpecName "kube-api-access-9lw8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.986678 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:14 crc kubenswrapper[4803]: I0320 17:39:14.986734 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lw8f\" (UniqueName: \"kubernetes.io/projected/222ebbc2-d868-492d-9922-89a1b009e676-kube-api-access-9lw8f\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.032804 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "222ebbc2-d868-492d-9922-89a1b009e676" (UID: "222ebbc2-d868-492d-9922-89a1b009e676"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.089709 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/222ebbc2-d868-492d-9922-89a1b009e676-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.350360 4803 generic.go:334] "Generic (PLEG): container finished" podID="222ebbc2-d868-492d-9922-89a1b009e676" containerID="1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c" exitCode=0 Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.350430 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2dz8" event={"ID":"222ebbc2-d868-492d-9922-89a1b009e676","Type":"ContainerDied","Data":"1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c"} Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.350876 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2dz8" event={"ID":"222ebbc2-d868-492d-9922-89a1b009e676","Type":"ContainerDied","Data":"46017309543af4c0d634a67209ad9e18188be60c19ae0bbc6f813bd3fa3f1a62"} Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.350917 4803 scope.go:117] "RemoveContainer" containerID="1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.350499 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2dz8" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.398787 4803 scope.go:117] "RemoveContainer" containerID="e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.407031 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2dz8"] Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.422583 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2dz8"] Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.443234 4803 scope.go:117] "RemoveContainer" containerID="43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.485162 4803 scope.go:117] "RemoveContainer" containerID="1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c" Mar 20 17:39:15 crc kubenswrapper[4803]: E0320 17:39:15.485740 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c\": container with ID starting with 1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c not found: ID does not exist" containerID="1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.485793 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c"} err="failed to get container status \"1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c\": rpc error: code = NotFound desc = could not find container \"1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c\": container with ID starting with 1ae9576007726680f71d5fc1e2a8dc5211d43dcd2e5f6bf1f99026ba9604508c not found: ID does 
not exist" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.485828 4803 scope.go:117] "RemoveContainer" containerID="e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294" Mar 20 17:39:15 crc kubenswrapper[4803]: E0320 17:39:15.487057 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294\": container with ID starting with e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294 not found: ID does not exist" containerID="e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.487117 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294"} err="failed to get container status \"e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294\": rpc error: code = NotFound desc = could not find container \"e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294\": container with ID starting with e8d680eccace1ec3e763dd4ea2d3ad1bfa2b57d576e51ef942d0d986f24ac294 not found: ID does not exist" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.487165 4803 scope.go:117] "RemoveContainer" containerID="43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938" Mar 20 17:39:15 crc kubenswrapper[4803]: E0320 17:39:15.487721 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938\": container with ID starting with 43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938 not found: ID does not exist" containerID="43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938" Mar 20 17:39:15 crc kubenswrapper[4803]: I0320 17:39:15.487742 4803 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938"} err="failed to get container status \"43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938\": rpc error: code = NotFound desc = could not find container \"43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938\": container with ID starting with 43667a8a791aa4481051498177ab301d8b71bde72187bd0da1df21f4a8273938 not found: ID does not exist" Mar 20 17:39:16 crc kubenswrapper[4803]: I0320 17:39:16.861718 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="222ebbc2-d868-492d-9922-89a1b009e676" path="/var/lib/kubelet/pods/222ebbc2-d868-492d-9922-89a1b009e676/volumes" Mar 20 17:39:16 crc kubenswrapper[4803]: I0320 17:39:16.977865 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 17:39:18 crc kubenswrapper[4803]: I0320 17:39:18.507545 4803 generic.go:334] "Generic (PLEG): container finished" podID="5daa4111-ac59-4390-b425-d56ac571c768" containerID="74a2ce8a742aeb11dbcdb1bbf0fa6b9597f2f122aa65f53c0a676505db4a0aa3" exitCode=0 Mar 20 17:39:18 crc kubenswrapper[4803]: I0320 17:39:18.508265 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" event={"ID":"5daa4111-ac59-4390-b425-d56ac571c768","Type":"ContainerDied","Data":"74a2ce8a742aeb11dbcdb1bbf0fa6b9597f2f122aa65f53c0a676505db4a0aa3"} Mar 20 17:39:19 crc kubenswrapper[4803]: I0320 17:39:19.966893 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.071736 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-ssh-key-openstack-edpm-ipam\") pod \"5daa4111-ac59-4390-b425-d56ac571c768\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.071949 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-inventory\") pod \"5daa4111-ac59-4390-b425-d56ac571c768\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.072084 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86l5v\" (UniqueName: \"kubernetes.io/projected/5daa4111-ac59-4390-b425-d56ac571c768-kube-api-access-86l5v\") pod \"5daa4111-ac59-4390-b425-d56ac571c768\" (UID: \"5daa4111-ac59-4390-b425-d56ac571c768\") " Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.077163 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5daa4111-ac59-4390-b425-d56ac571c768-kube-api-access-86l5v" (OuterVolumeSpecName: "kube-api-access-86l5v") pod "5daa4111-ac59-4390-b425-d56ac571c768" (UID: "5daa4111-ac59-4390-b425-d56ac571c768"). InnerVolumeSpecName "kube-api-access-86l5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.097589 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-inventory" (OuterVolumeSpecName: "inventory") pod "5daa4111-ac59-4390-b425-d56ac571c768" (UID: "5daa4111-ac59-4390-b425-d56ac571c768"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.098394 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5daa4111-ac59-4390-b425-d56ac571c768" (UID: "5daa4111-ac59-4390-b425-d56ac571c768"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.174259 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.174319 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86l5v\" (UniqueName: \"kubernetes.io/projected/5daa4111-ac59-4390-b425-d56ac571c768-kube-api-access-86l5v\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.174334 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5daa4111-ac59-4390-b425-d56ac571c768-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.530726 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" event={"ID":"5daa4111-ac59-4390-b425-d56ac571c768","Type":"ContainerDied","Data":"ab910a731b8f2ac12785906835165ee2ff07308163fa64bcb66e326d46aac164"} Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.530788 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab910a731b8f2ac12785906835165ee2ff07308163fa64bcb66e326d46aac164" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 
17:39:20.530861 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-qtns4" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.625049 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn"] Mar 20 17:39:20 crc kubenswrapper[4803]: E0320 17:39:20.625544 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="extract-utilities" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.625564 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="extract-utilities" Mar 20 17:39:20 crc kubenswrapper[4803]: E0320 17:39:20.625593 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5daa4111-ac59-4390-b425-d56ac571c768" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.625603 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5daa4111-ac59-4390-b425-d56ac571c768" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:39:20 crc kubenswrapper[4803]: E0320 17:39:20.625639 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="extract-content" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.625648 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="extract-content" Mar 20 17:39:20 crc kubenswrapper[4803]: E0320 17:39:20.625662 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="registry-server" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.625670 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="registry-server" Mar 20 
17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.625887 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5daa4111-ac59-4390-b425-d56ac571c768" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.625922 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="222ebbc2-d868-492d-9922-89a1b009e676" containerName="registry-server" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.626779 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.630974 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.632010 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.632155 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.632573 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.646366 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn"] Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.682843 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.682921 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.683006 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.683062 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db2dd\" (UniqueName: \"kubernetes.io/projected/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-kube-api-access-db2dd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.785465 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.785654 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.785746 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db2dd\" (UniqueName: \"kubernetes.io/projected/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-kube-api-access-db2dd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.785904 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.791261 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.792481 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-ssh-key-openstack-edpm-ipam\") 
pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.803138 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.822185 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db2dd\" (UniqueName: \"kubernetes.io/projected/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-kube-api-access-db2dd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:20 crc kubenswrapper[4803]: I0320 17:39:20.952351 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:39:21 crc kubenswrapper[4803]: I0320 17:39:21.545576 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn"] Mar 20 17:39:22 crc kubenswrapper[4803]: I0320 17:39:22.564558 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" event={"ID":"0c2a599d-17eb-4116-8b3e-a9adc8a7568b","Type":"ContainerStarted","Data":"d27fcb95cb79890747cb9888b65f974380d28260c485e438f3ea611aab810506"} Mar 20 17:39:22 crc kubenswrapper[4803]: I0320 17:39:22.564901 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" event={"ID":"0c2a599d-17eb-4116-8b3e-a9adc8a7568b","Type":"ContainerStarted","Data":"f60ae089f668e54bd76f467d98e842bb1907e1897550c6a9ada36fa7974b0f98"} Mar 20 17:39:22 crc kubenswrapper[4803]: I0320 17:39:22.580244 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" podStartSLOduration=2.11861934 podStartE2EDuration="2.580222023s" podCreationTimestamp="2026-03-20 17:39:20 +0000 UTC" firstStartedPulling="2026-03-20 17:39:21.553691282 +0000 UTC m=+1371.465283352" lastFinishedPulling="2026-03-20 17:39:22.015293935 +0000 UTC m=+1371.926886035" observedRunningTime="2026-03-20 17:39:22.579288337 +0000 UTC m=+1372.490880407" watchObservedRunningTime="2026-03-20 17:39:22.580222023 +0000 UTC m=+1372.491814113" Mar 20 17:39:38 crc kubenswrapper[4803]: I0320 17:39:38.245571 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:39:38 crc kubenswrapper[4803]: 
I0320 17:39:38.246129 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:39:38 crc kubenswrapper[4803]: I0320 17:39:38.246188 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:39:38 crc kubenswrapper[4803]: I0320 17:39:38.247143 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ddde2c658fdb702fa12402952a89565e3db989d32a4cbf09d114ff157fa7417"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:39:38 crc kubenswrapper[4803]: I0320 17:39:38.247311 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://7ddde2c658fdb702fa12402952a89565e3db989d32a4cbf09d114ff157fa7417" gracePeriod=600 Mar 20 17:39:38 crc kubenswrapper[4803]: I0320 17:39:38.766068 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="7ddde2c658fdb702fa12402952a89565e3db989d32a4cbf09d114ff157fa7417" exitCode=0 Mar 20 17:39:38 crc kubenswrapper[4803]: I0320 17:39:38.766151 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"7ddde2c658fdb702fa12402952a89565e3db989d32a4cbf09d114ff157fa7417"} Mar 20 17:39:38 crc 
kubenswrapper[4803]: I0320 17:39:38.766414 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"} Mar 20 17:39:38 crc kubenswrapper[4803]: I0320 17:39:38.766449 4803 scope.go:117] "RemoveContainer" containerID="1bf505c950c915ca8f0c808968521bcabc9e302497ac83cfddf349e9c7d8bb55" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.166114 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567140-9xqw6"] Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.168951 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-9xqw6" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.172304 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.172512 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.172636 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.181192 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-9xqw6"] Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.262586 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2tnr\" (UniqueName: \"kubernetes.io/projected/8285ef9f-caaf-4e35-a37f-1dbe5914e739-kube-api-access-l2tnr\") pod \"auto-csr-approver-29567140-9xqw6\" (UID: \"8285ef9f-caaf-4e35-a37f-1dbe5914e739\") " 
pod="openshift-infra/auto-csr-approver-29567140-9xqw6" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.364694 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2tnr\" (UniqueName: \"kubernetes.io/projected/8285ef9f-caaf-4e35-a37f-1dbe5914e739-kube-api-access-l2tnr\") pod \"auto-csr-approver-29567140-9xqw6\" (UID: \"8285ef9f-caaf-4e35-a37f-1dbe5914e739\") " pod="openshift-infra/auto-csr-approver-29567140-9xqw6" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.399859 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2tnr\" (UniqueName: \"kubernetes.io/projected/8285ef9f-caaf-4e35-a37f-1dbe5914e739-kube-api-access-l2tnr\") pod \"auto-csr-approver-29567140-9xqw6\" (UID: \"8285ef9f-caaf-4e35-a37f-1dbe5914e739\") " pod="openshift-infra/auto-csr-approver-29567140-9xqw6" Mar 20 17:40:00 crc kubenswrapper[4803]: I0320 17:40:00.531215 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-9xqw6" Mar 20 17:40:01 crc kubenswrapper[4803]: I0320 17:40:01.068180 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-9xqw6"] Mar 20 17:40:01 crc kubenswrapper[4803]: W0320 17:40:01.069557 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8285ef9f_caaf_4e35_a37f_1dbe5914e739.slice/crio-2610c2f724367698a8ebde938ce7f3e076bac5f323d9d370442144fe5c237b3f WatchSource:0}: Error finding container 2610c2f724367698a8ebde938ce7f3e076bac5f323d9d370442144fe5c237b3f: Status 404 returned error can't find the container with id 2610c2f724367698a8ebde938ce7f3e076bac5f323d9d370442144fe5c237b3f Mar 20 17:40:01 crc kubenswrapper[4803]: I0320 17:40:01.074775 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:40:02 crc kubenswrapper[4803]: 
I0320 17:40:02.054493 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-9xqw6" event={"ID":"8285ef9f-caaf-4e35-a37f-1dbe5914e739","Type":"ContainerStarted","Data":"2610c2f724367698a8ebde938ce7f3e076bac5f323d9d370442144fe5c237b3f"} Mar 20 17:40:04 crc kubenswrapper[4803]: I0320 17:40:04.084403 4803 generic.go:334] "Generic (PLEG): container finished" podID="8285ef9f-caaf-4e35-a37f-1dbe5914e739" containerID="bcc0e9cab83123e1afbbb52dd23576fd843d14029b93754d0938f212da70f45e" exitCode=0 Mar 20 17:40:04 crc kubenswrapper[4803]: I0320 17:40:04.084502 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-9xqw6" event={"ID":"8285ef9f-caaf-4e35-a37f-1dbe5914e739","Type":"ContainerDied","Data":"bcc0e9cab83123e1afbbb52dd23576fd843d14029b93754d0938f212da70f45e"} Mar 20 17:40:05 crc kubenswrapper[4803]: I0320 17:40:05.400753 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-9xqw6" Mar 20 17:40:05 crc kubenswrapper[4803]: I0320 17:40:05.487803 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2tnr\" (UniqueName: \"kubernetes.io/projected/8285ef9f-caaf-4e35-a37f-1dbe5914e739-kube-api-access-l2tnr\") pod \"8285ef9f-caaf-4e35-a37f-1dbe5914e739\" (UID: \"8285ef9f-caaf-4e35-a37f-1dbe5914e739\") " Mar 20 17:40:05 crc kubenswrapper[4803]: I0320 17:40:05.496144 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8285ef9f-caaf-4e35-a37f-1dbe5914e739-kube-api-access-l2tnr" (OuterVolumeSpecName: "kube-api-access-l2tnr") pod "8285ef9f-caaf-4e35-a37f-1dbe5914e739" (UID: "8285ef9f-caaf-4e35-a37f-1dbe5914e739"). InnerVolumeSpecName "kube-api-access-l2tnr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:40:05 crc kubenswrapper[4803]: I0320 17:40:05.590030 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2tnr\" (UniqueName: \"kubernetes.io/projected/8285ef9f-caaf-4e35-a37f-1dbe5914e739-kube-api-access-l2tnr\") on node \"crc\" DevicePath \"\"" Mar 20 17:40:06 crc kubenswrapper[4803]: I0320 17:40:06.109864 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567140-9xqw6" event={"ID":"8285ef9f-caaf-4e35-a37f-1dbe5914e739","Type":"ContainerDied","Data":"2610c2f724367698a8ebde938ce7f3e076bac5f323d9d370442144fe5c237b3f"} Mar 20 17:40:06 crc kubenswrapper[4803]: I0320 17:40:06.109970 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2610c2f724367698a8ebde938ce7f3e076bac5f323d9d370442144fe5c237b3f" Mar 20 17:40:06 crc kubenswrapper[4803]: I0320 17:40:06.110364 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567140-9xqw6" Mar 20 17:40:06 crc kubenswrapper[4803]: I0320 17:40:06.486283 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-l24k6"] Mar 20 17:40:06 crc kubenswrapper[4803]: I0320 17:40:06.495808 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567134-l24k6"] Mar 20 17:40:06 crc kubenswrapper[4803]: I0320 17:40:06.869824 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b7f54dd-e657-437a-8638-63c5a5cbc8c0" path="/var/lib/kubelet/pods/2b7f54dd-e657-437a-8638-63c5a5cbc8c0/volumes" Mar 20 17:40:20 crc kubenswrapper[4803]: I0320 17:40:20.526172 4803 scope.go:117] "RemoveContainer" containerID="7ce7f6efeed7010297f2b259cf0e439588c084dab1c38d97c122f4f2fa496303" Mar 20 17:40:20 crc kubenswrapper[4803]: I0320 17:40:20.580793 4803 scope.go:117] "RemoveContainer" 
containerID="dd85f501e501030ecbc8e3b2457ac08842ebe17167f2f2597a29001e781549c5" Mar 20 17:40:20 crc kubenswrapper[4803]: I0320 17:40:20.656989 4803 scope.go:117] "RemoveContainer" containerID="3702573a176569d3e3344cfbb185455ff929addee44889ff3fb569eadffcdde5" Mar 20 17:40:20 crc kubenswrapper[4803]: I0320 17:40:20.705684 4803 scope.go:117] "RemoveContainer" containerID="7721ae2915ccc174f5844f83fbe4d12b05cabfb5654cd7e3b701902b9652f8d7" Mar 20 17:41:20 crc kubenswrapper[4803]: I0320 17:41:20.870795 4803 scope.go:117] "RemoveContainer" containerID="118e9705aaa8a134f7b51ee635781e210861eb775bae53298f2aeeb57f80314c" Mar 20 17:41:20 crc kubenswrapper[4803]: I0320 17:41:20.923434 4803 scope.go:117] "RemoveContainer" containerID="90edf5a9dbd8c7f4398a1cedcc13d4bf83cd86e752363131cdb6db3760423011" Mar 20 17:41:20 crc kubenswrapper[4803]: I0320 17:41:20.983623 4803 scope.go:117] "RemoveContainer" containerID="5f8eee39cb1e2bae324df0b81bafc95c7eb474ef6d9cd9db257fe0d718615d13" Mar 20 17:41:38 crc kubenswrapper[4803]: I0320 17:41:38.245307 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:41:38 crc kubenswrapper[4803]: I0320 17:41:38.245850 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:41:53 crc kubenswrapper[4803]: I0320 17:41:53.959644 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2hjcd"] Mar 20 17:41:53 crc kubenswrapper[4803]: E0320 17:41:53.960696 4803 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8285ef9f-caaf-4e35-a37f-1dbe5914e739" containerName="oc" Mar 20 17:41:53 crc kubenswrapper[4803]: I0320 17:41:53.960711 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="8285ef9f-caaf-4e35-a37f-1dbe5914e739" containerName="oc" Mar 20 17:41:53 crc kubenswrapper[4803]: I0320 17:41:53.960985 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="8285ef9f-caaf-4e35-a37f-1dbe5914e739" containerName="oc" Mar 20 17:41:53 crc kubenswrapper[4803]: I0320 17:41:53.963124 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:53 crc kubenswrapper[4803]: I0320 17:41:53.978484 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2hjcd"] Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.006251 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-catalog-content\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.006324 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f4sc\" (UniqueName: \"kubernetes.io/projected/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-kube-api-access-7f4sc\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.006458 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-utilities\") pod \"community-operators-2hjcd\" (UID: 
\"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.108714 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-catalog-content\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.108775 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f4sc\" (UniqueName: \"kubernetes.io/projected/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-kube-api-access-7f4sc\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.108843 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-utilities\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.109504 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-catalog-content\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.109556 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-utilities\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") 
" pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.135065 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f4sc\" (UniqueName: \"kubernetes.io/projected/ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98-kube-api-access-7f4sc\") pod \"community-operators-2hjcd\" (UID: \"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98\") " pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.293375 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:41:54 crc kubenswrapper[4803]: I0320 17:41:54.868983 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2hjcd"] Mar 20 17:41:55 crc kubenswrapper[4803]: I0320 17:41:55.532009 4803 generic.go:334] "Generic (PLEG): container finished" podID="ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98" containerID="62f6c8bbe5bd1b37d738bcad3c66e5cbbe9278df3cb06702daa1b89c5bd2f41d" exitCode=0 Mar 20 17:41:55 crc kubenswrapper[4803]: I0320 17:41:55.532079 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjcd" event={"ID":"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98","Type":"ContainerDied","Data":"62f6c8bbe5bd1b37d738bcad3c66e5cbbe9278df3cb06702daa1b89c5bd2f41d"} Mar 20 17:41:55 crc kubenswrapper[4803]: I0320 17:41:55.532281 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjcd" event={"ID":"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98","Type":"ContainerStarted","Data":"3394972d028914d00e85478a1a860e349eb99e6a12eeff6981db06a1ced17e88"} Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.153867 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567142-wh7rr"] Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.155717 4803 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.157866 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.158013 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.159646 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.176824 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-wh7rr"] Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.280371 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9k4v\" (UniqueName: \"kubernetes.io/projected/b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7-kube-api-access-r9k4v\") pod \"auto-csr-approver-29567142-wh7rr\" (UID: \"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7\") " pod="openshift-infra/auto-csr-approver-29567142-wh7rr" Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.382176 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9k4v\" (UniqueName: \"kubernetes.io/projected/b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7-kube-api-access-r9k4v\") pod \"auto-csr-approver-29567142-wh7rr\" (UID: \"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7\") " pod="openshift-infra/auto-csr-approver-29567142-wh7rr" Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.404587 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9k4v\" (UniqueName: \"kubernetes.io/projected/b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7-kube-api-access-r9k4v\") pod \"auto-csr-approver-29567142-wh7rr\" (UID: \"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7\") 
" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" Mar 20 17:42:00 crc kubenswrapper[4803]: I0320 17:42:00.477208 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" Mar 20 17:42:01 crc kubenswrapper[4803]: I0320 17:42:01.153845 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-wh7rr"] Mar 20 17:42:01 crc kubenswrapper[4803]: I0320 17:42:01.631459 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" event={"ID":"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7","Type":"ContainerStarted","Data":"ed6d9a7cdab6b1967ce0aae42b9a148747194d814584defd5a1d4dffafadb5a5"} Mar 20 17:42:01 crc kubenswrapper[4803]: I0320 17:42:01.633969 4803 generic.go:334] "Generic (PLEG): container finished" podID="ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98" containerID="c4fce27c7885721951ef199150fe11724c3851c690eb55c24d8d574f2f285fcf" exitCode=0 Mar 20 17:42:01 crc kubenswrapper[4803]: I0320 17:42:01.634161 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjcd" event={"ID":"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98","Type":"ContainerDied","Data":"c4fce27c7885721951ef199150fe11724c3851c690eb55c24d8d574f2f285fcf"} Mar 20 17:42:03 crc kubenswrapper[4803]: I0320 17:42:03.665235 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hjcd" event={"ID":"ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98","Type":"ContainerStarted","Data":"0c98719d17fb1bce2d6e43b5d55a415f13fc22f057fbd0103916bcfff23ea334"} Mar 20 17:42:04 crc kubenswrapper[4803]: I0320 17:42:04.676289 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" event={"ID":"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7","Type":"ContainerStarted","Data":"c7f1a559893fc902708f548cb281df3a3e3d7b05d13531135b05238e6580e6b9"} Mar 20 17:42:04 crc 
kubenswrapper[4803]: I0320 17:42:04.700252 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2hjcd" podStartSLOduration=4.055165774 podStartE2EDuration="11.700228374s" podCreationTimestamp="2026-03-20 17:41:53 +0000 UTC" firstStartedPulling="2026-03-20 17:41:55.534565177 +0000 UTC m=+1525.446157247" lastFinishedPulling="2026-03-20 17:42:03.179627777 +0000 UTC m=+1533.091219847" observedRunningTime="2026-03-20 17:42:04.693068785 +0000 UTC m=+1534.604660885" watchObservedRunningTime="2026-03-20 17:42:04.700228374 +0000 UTC m=+1534.611820464" Mar 20 17:42:04 crc kubenswrapper[4803]: I0320 17:42:04.751282 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" podStartSLOduration=2.203627632 podStartE2EDuration="4.751260488s" podCreationTimestamp="2026-03-20 17:42:00 +0000 UTC" firstStartedPulling="2026-03-20 17:42:01.153648316 +0000 UTC m=+1531.065240396" lastFinishedPulling="2026-03-20 17:42:03.701281172 +0000 UTC m=+1533.612873252" observedRunningTime="2026-03-20 17:42:04.740402977 +0000 UTC m=+1534.651995057" watchObservedRunningTime="2026-03-20 17:42:04.751260488 +0000 UTC m=+1534.662852568" Mar 20 17:42:05 crc kubenswrapper[4803]: I0320 17:42:05.688027 4803 generic.go:334] "Generic (PLEG): container finished" podID="b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7" containerID="c7f1a559893fc902708f548cb281df3a3e3d7b05d13531135b05238e6580e6b9" exitCode=0 Mar 20 17:42:05 crc kubenswrapper[4803]: I0320 17:42:05.688075 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" event={"ID":"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7","Type":"ContainerDied","Data":"c7f1a559893fc902708f548cb281df3a3e3d7b05d13531135b05238e6580e6b9"} Mar 20 17:42:07 crc kubenswrapper[4803]: I0320 17:42:07.074278 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" Mar 20 17:42:07 crc kubenswrapper[4803]: I0320 17:42:07.180268 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9k4v\" (UniqueName: \"kubernetes.io/projected/b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7-kube-api-access-r9k4v\") pod \"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7\" (UID: \"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7\") " Mar 20 17:42:07 crc kubenswrapper[4803]: I0320 17:42:07.186461 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7-kube-api-access-r9k4v" (OuterVolumeSpecName: "kube-api-access-r9k4v") pod "b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7" (UID: "b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7"). InnerVolumeSpecName "kube-api-access-r9k4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:07 crc kubenswrapper[4803]: I0320 17:42:07.282519 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9k4v\" (UniqueName: \"kubernetes.io/projected/b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7-kube-api-access-r9k4v\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:07 crc kubenswrapper[4803]: I0320 17:42:07.703440 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" event={"ID":"b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7","Type":"ContainerDied","Data":"ed6d9a7cdab6b1967ce0aae42b9a148747194d814584defd5a1d4dffafadb5a5"} Mar 20 17:42:07 crc kubenswrapper[4803]: I0320 17:42:07.703476 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed6d9a7cdab6b1967ce0aae42b9a148747194d814584defd5a1d4dffafadb5a5" Mar 20 17:42:07 crc kubenswrapper[4803]: I0320 17:42:07.703538 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567142-wh7rr" Mar 20 17:42:08 crc kubenswrapper[4803]: I0320 17:42:08.161408 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-nkrhf"] Mar 20 17:42:08 crc kubenswrapper[4803]: I0320 17:42:08.175110 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567136-nkrhf"] Mar 20 17:42:08 crc kubenswrapper[4803]: I0320 17:42:08.246393 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:42:08 crc kubenswrapper[4803]: I0320 17:42:08.246461 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:42:08 crc kubenswrapper[4803]: I0320 17:42:08.858352 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af2e4734-6885-46ad-a064-324222e418cc" path="/var/lib/kubelet/pods/af2e4734-6885-46ad-a064-324222e418cc/volumes" Mar 20 17:42:14 crc kubenswrapper[4803]: I0320 17:42:14.293900 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:42:14 crc kubenswrapper[4803]: I0320 17:42:14.296091 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:42:14 crc kubenswrapper[4803]: I0320 17:42:14.391080 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2hjcd" Mar 
20 17:42:14 crc kubenswrapper[4803]: I0320 17:42:14.866876 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2hjcd" Mar 20 17:42:14 crc kubenswrapper[4803]: I0320 17:42:14.974804 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2hjcd"] Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.064608 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m7j6"] Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.064872 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4m7j6" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="registry-server" containerID="cri-o://fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350" gracePeriod=2 Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.518357 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.670345 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-utilities\") pod \"ac03c75a-1844-4a22-9a24-4fa1720906be\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.670624 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94xg8\" (UniqueName: \"kubernetes.io/projected/ac03c75a-1844-4a22-9a24-4fa1720906be-kube-api-access-94xg8\") pod \"ac03c75a-1844-4a22-9a24-4fa1720906be\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.670678 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-catalog-content\") pod \"ac03c75a-1844-4a22-9a24-4fa1720906be\" (UID: \"ac03c75a-1844-4a22-9a24-4fa1720906be\") " Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.671030 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-utilities" (OuterVolumeSpecName: "utilities") pod "ac03c75a-1844-4a22-9a24-4fa1720906be" (UID: "ac03c75a-1844-4a22-9a24-4fa1720906be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.678913 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac03c75a-1844-4a22-9a24-4fa1720906be-kube-api-access-94xg8" (OuterVolumeSpecName: "kube-api-access-94xg8") pod "ac03c75a-1844-4a22-9a24-4fa1720906be" (UID: "ac03c75a-1844-4a22-9a24-4fa1720906be"). InnerVolumeSpecName "kube-api-access-94xg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.685636 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94xg8\" (UniqueName: \"kubernetes.io/projected/ac03c75a-1844-4a22-9a24-4fa1720906be-kube-api-access-94xg8\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.685673 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.728542 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac03c75a-1844-4a22-9a24-4fa1720906be" (UID: "ac03c75a-1844-4a22-9a24-4fa1720906be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.786787 4803 generic.go:334] "Generic (PLEG): container finished" podID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerID="fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350" exitCode=0 Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.786856 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4m7j6" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.786857 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerDied","Data":"fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350"} Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.786920 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4m7j6" event={"ID":"ac03c75a-1844-4a22-9a24-4fa1720906be","Type":"ContainerDied","Data":"38cf0412c8af63a63dda2198861754884a3fec269de85be635b838bc52218ae6"} Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.786940 4803 scope.go:117] "RemoveContainer" containerID="fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.789012 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac03c75a-1844-4a22-9a24-4fa1720906be-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.807668 4803 scope.go:117] "RemoveContainer" containerID="afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.822911 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4m7j6"] Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.838362 4803 scope.go:117] "RemoveContainer" containerID="8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.841591 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4m7j6"] Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.871645 4803 scope.go:117] "RemoveContainer" 
containerID="fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350" Mar 20 17:42:15 crc kubenswrapper[4803]: E0320 17:42:15.872111 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350\": container with ID starting with fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350 not found: ID does not exist" containerID="fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.872195 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350"} err="failed to get container status \"fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350\": rpc error: code = NotFound desc = could not find container \"fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350\": container with ID starting with fef2059eb599425fb9a7c0b5bb35daed6bde25a14836cc10134a73632ad8b350 not found: ID does not exist" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.872265 4803 scope.go:117] "RemoveContainer" containerID="afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964" Mar 20 17:42:15 crc kubenswrapper[4803]: E0320 17:42:15.872759 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964\": container with ID starting with afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964 not found: ID does not exist" containerID="afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.872843 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964"} err="failed to get container status \"afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964\": rpc error: code = NotFound desc = could not find container \"afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964\": container with ID starting with afff7a94935345bbec0062cdf13ffc3a0b81117015a94b7fde576fb3d292a964 not found: ID does not exist" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.872905 4803 scope.go:117] "RemoveContainer" containerID="8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8" Mar 20 17:42:15 crc kubenswrapper[4803]: E0320 17:42:15.873196 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8\": container with ID starting with 8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8 not found: ID does not exist" containerID="8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8" Mar 20 17:42:15 crc kubenswrapper[4803]: I0320 17:42:15.873274 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8"} err="failed to get container status \"8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8\": rpc error: code = NotFound desc = could not find container \"8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8\": container with ID starting with 8c9bbbca749070c006f0c321e2ac4f10cba46e76d6dc2700199ac7e552b3b3a8 not found: ID does not exist" Mar 20 17:42:16 crc kubenswrapper[4803]: I0320 17:42:16.860203 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" path="/var/lib/kubelet/pods/ac03c75a-1844-4a22-9a24-4fa1720906be/volumes" Mar 20 17:42:21 crc kubenswrapper[4803]: I0320 
17:42:21.270825 4803 scope.go:117] "RemoveContainer" containerID="93fad68ac303ce1c7c559099d76de9a994a32245356543e5450b75e8788e0c50" Mar 20 17:42:37 crc kubenswrapper[4803]: I0320 17:42:37.019999 4803 generic.go:334] "Generic (PLEG): container finished" podID="0c2a599d-17eb-4116-8b3e-a9adc8a7568b" containerID="d27fcb95cb79890747cb9888b65f974380d28260c485e438f3ea611aab810506" exitCode=0 Mar 20 17:42:37 crc kubenswrapper[4803]: I0320 17:42:37.020096 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" event={"ID":"0c2a599d-17eb-4116-8b3e-a9adc8a7568b","Type":"ContainerDied","Data":"d27fcb95cb79890747cb9888b65f974380d28260c485e438f3ea611aab810506"} Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.250723 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.251086 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.251150 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.251959 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"} 
pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.252045 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" gracePeriod=600 Mar 20 17:42:38 crc kubenswrapper[4803]: E0320 17:42:38.387432 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.465041 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.593767 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db2dd\" (UniqueName: \"kubernetes.io/projected/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-kube-api-access-db2dd\") pod \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.593820 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-ssh-key-openstack-edpm-ipam\") pod \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.594011 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-inventory\") pod \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.594078 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-bootstrap-combined-ca-bundle\") pod \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\" (UID: \"0c2a599d-17eb-4116-8b3e-a9adc8a7568b\") " Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.599576 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-kube-api-access-db2dd" (OuterVolumeSpecName: "kube-api-access-db2dd") pod "0c2a599d-17eb-4116-8b3e-a9adc8a7568b" (UID: "0c2a599d-17eb-4116-8b3e-a9adc8a7568b"). InnerVolumeSpecName "kube-api-access-db2dd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.600385 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0c2a599d-17eb-4116-8b3e-a9adc8a7568b" (UID: "0c2a599d-17eb-4116-8b3e-a9adc8a7568b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.639913 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-inventory" (OuterVolumeSpecName: "inventory") pod "0c2a599d-17eb-4116-8b3e-a9adc8a7568b" (UID: "0c2a599d-17eb-4116-8b3e-a9adc8a7568b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.644695 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0c2a599d-17eb-4116-8b3e-a9adc8a7568b" (UID: "0c2a599d-17eb-4116-8b3e-a9adc8a7568b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.696367 4803 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.696402 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db2dd\" (UniqueName: \"kubernetes.io/projected/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-kube-api-access-db2dd\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.696414 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:38 crc kubenswrapper[4803]: I0320 17:42:38.696423 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0c2a599d-17eb-4116-8b3e-a9adc8a7568b-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.043867 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.043951 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn" event={"ID":"0c2a599d-17eb-4116-8b3e-a9adc8a7568b","Type":"ContainerDied","Data":"f60ae089f668e54bd76f467d98e842bb1907e1897550c6a9ada36fa7974b0f98"} Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.044113 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60ae089f668e54bd76f467d98e842bb1907e1897550c6a9ada36fa7974b0f98" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.049970 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" exitCode=0 Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.050020 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"} Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.050095 4803 scope.go:117] "RemoveContainer" containerID="7ddde2c658fdb702fa12402952a89565e3db989d32a4cbf09d114ff157fa7417" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.051187 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:42:39 crc kubenswrapper[4803]: E0320 17:42:39.051903 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185068 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn"] Mar 20 17:42:39 crc kubenswrapper[4803]: E0320 17:42:39.185593 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="extract-utilities" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185615 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="extract-utilities" Mar 20 17:42:39 crc kubenswrapper[4803]: E0320 17:42:39.185637 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2a599d-17eb-4116-8b3e-a9adc8a7568b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185648 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2a599d-17eb-4116-8b3e-a9adc8a7568b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:39 crc kubenswrapper[4803]: E0320 17:42:39.185670 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="registry-server" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185678 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="registry-server" Mar 20 17:42:39 crc kubenswrapper[4803]: E0320 17:42:39.185691 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7" containerName="oc" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185698 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7" containerName="oc" Mar 20 17:42:39 crc kubenswrapper[4803]: E0320 17:42:39.185719 4803 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="extract-content" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185727 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="extract-content" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185942 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac03c75a-1844-4a22-9a24-4fa1720906be" containerName="registry-server" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185958 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2a599d-17eb-4116-8b3e-a9adc8a7568b" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.185995 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7" containerName="oc" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.186754 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.191640 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.192042 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.192229 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.192402 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.194938 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn"] Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.309154 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.309243 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 
17:42:39.309422 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znfb\" (UniqueName: \"kubernetes.io/projected/c2c45e4b-27e5-4614-875f-838444583617-kube-api-access-6znfb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.412258 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6znfb\" (UniqueName: \"kubernetes.io/projected/c2c45e4b-27e5-4614-875f-838444583617-kube-api-access-6znfb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.413057 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.413141 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.421455 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.426007 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.449022 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znfb\" (UniqueName: \"kubernetes.io/projected/c2c45e4b-27e5-4614-875f-838444583617-kube-api-access-6znfb\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:39 crc kubenswrapper[4803]: I0320 17:42:39.507658 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:42:40 crc kubenswrapper[4803]: I0320 17:42:40.190291 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn"] Mar 20 17:42:40 crc kubenswrapper[4803]: W0320 17:42:40.202244 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2c45e4b_27e5_4614_875f_838444583617.slice/crio-0cecb3581a66907a8803773be4910abff5e60d1426f8da0f6a9c67fb8fe3beba WatchSource:0}: Error finding container 0cecb3581a66907a8803773be4910abff5e60d1426f8da0f6a9c67fb8fe3beba: Status 404 returned error can't find the container with id 0cecb3581a66907a8803773be4910abff5e60d1426f8da0f6a9c67fb8fe3beba Mar 20 17:42:41 crc kubenswrapper[4803]: I0320 17:42:41.078435 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" event={"ID":"c2c45e4b-27e5-4614-875f-838444583617","Type":"ContainerStarted","Data":"0cecb3581a66907a8803773be4910abff5e60d1426f8da0f6a9c67fb8fe3beba"} Mar 20 17:42:42 crc kubenswrapper[4803]: I0320 17:42:42.093609 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" event={"ID":"c2c45e4b-27e5-4614-875f-838444583617","Type":"ContainerStarted","Data":"c567efd90c79dafa7ea9860215626b074c8400f14589efd3f83c6b4a33c5e0ff"} Mar 20 17:42:42 crc kubenswrapper[4803]: I0320 17:42:42.125356 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" podStartSLOduration=2.264665014 podStartE2EDuration="3.125330153s" podCreationTimestamp="2026-03-20 17:42:39 +0000 UTC" firstStartedPulling="2026-03-20 17:42:40.206492002 +0000 UTC m=+1570.118084072" lastFinishedPulling="2026-03-20 17:42:41.067157131 +0000 UTC 
m=+1570.978749211" observedRunningTime="2026-03-20 17:42:42.116573011 +0000 UTC m=+1572.028165151" watchObservedRunningTime="2026-03-20 17:42:42.125330153 +0000 UTC m=+1572.036922263" Mar 20 17:42:49 crc kubenswrapper[4803]: I0320 17:42:49.879264 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:42:49 crc kubenswrapper[4803]: E0320 17:42:49.880127 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:43:00 crc kubenswrapper[4803]: I0320 17:43:00.859643 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:43:00 crc kubenswrapper[4803]: E0320 17:43:00.860681 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.135418 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qgglg"] Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.140012 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.159398 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgglg"] Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.234088 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-catalog-content\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.234137 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckgzm\" (UniqueName: \"kubernetes.io/projected/fd0628d1-60c8-45a5-a01e-01c6566e13a7-kube-api-access-ckgzm\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.234227 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-utilities\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.336322 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-catalog-content\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.336734 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ckgzm\" (UniqueName: \"kubernetes.io/projected/fd0628d1-60c8-45a5-a01e-01c6566e13a7-kube-api-access-ckgzm\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.336884 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-utilities\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.336995 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-catalog-content\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.337500 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-utilities\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.363106 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckgzm\" (UniqueName: \"kubernetes.io/projected/fd0628d1-60c8-45a5-a01e-01c6566e13a7-kube-api-access-ckgzm\") pod \"redhat-marketplace-qgglg\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") " pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.473191 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgglg" Mar 20 17:43:05 crc kubenswrapper[4803]: I0320 17:43:05.764412 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgglg"] Mar 20 17:43:06 crc kubenswrapper[4803]: I0320 17:43:06.358755 4803 generic.go:334] "Generic (PLEG): container finished" podID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerID="4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b" exitCode=0 Mar 20 17:43:06 crc kubenswrapper[4803]: I0320 17:43:06.358819 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgglg" event={"ID":"fd0628d1-60c8-45a5-a01e-01c6566e13a7","Type":"ContainerDied","Data":"4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b"} Mar 20 17:43:06 crc kubenswrapper[4803]: I0320 17:43:06.358895 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgglg" event={"ID":"fd0628d1-60c8-45a5-a01e-01c6566e13a7","Type":"ContainerStarted","Data":"67226b12098127e45ef3cfe91c71e43723a8d4997d1ac9426d5870c8a261a290"} Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.120648 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lqtw6"] Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.123310 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.136605 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqtw6"]
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.195850 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-catalog-content\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.196165 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/3b08e67e-e045-4c6e-8eba-69057b4512be-kube-api-access-xc6w9\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.196218 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-utilities\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.297545 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/3b08e67e-e045-4c6e-8eba-69057b4512be-kube-api-access-xc6w9\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.297595 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-utilities\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.298115 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-utilities\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.298158 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-catalog-content\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.298176 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-catalog-content\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.315972 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/3b08e67e-e045-4c6e-8eba-69057b4512be-kube-api-access-xc6w9\") pod \"certified-operators-lqtw6\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") " pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.377167 4803 generic.go:334] "Generic (PLEG): container finished" podID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerID="d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181" exitCode=0
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.377214 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgglg" event={"ID":"fd0628d1-60c8-45a5-a01e-01c6566e13a7","Type":"ContainerDied","Data":"d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181"}
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.449207 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:08 crc kubenswrapper[4803]: I0320 17:43:08.943068 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lqtw6"]
Mar 20 17:43:09 crc kubenswrapper[4803]: I0320 17:43:09.388776 4803 generic.go:334] "Generic (PLEG): container finished" podID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerID="b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce" exitCode=0
Mar 20 17:43:09 crc kubenswrapper[4803]: I0320 17:43:09.388851 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqtw6" event={"ID":"3b08e67e-e045-4c6e-8eba-69057b4512be","Type":"ContainerDied","Data":"b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce"}
Mar 20 17:43:09 crc kubenswrapper[4803]: I0320 17:43:09.389780 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqtw6" event={"ID":"3b08e67e-e045-4c6e-8eba-69057b4512be","Type":"ContainerStarted","Data":"ffac6623176f14b0cde4131d217c578ff09ae033348e83207e18f8c68b3b848d"}
Mar 20 17:43:09 crc kubenswrapper[4803]: I0320 17:43:09.393641 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgglg" event={"ID":"fd0628d1-60c8-45a5-a01e-01c6566e13a7","Type":"ContainerStarted","Data":"57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151"}
Mar 20 17:43:09 crc kubenswrapper[4803]: I0320 17:43:09.445516 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qgglg" podStartSLOduration=1.97960337 podStartE2EDuration="4.445493822s" podCreationTimestamp="2026-03-20 17:43:05 +0000 UTC" firstStartedPulling="2026-03-20 17:43:06.361928813 +0000 UTC m=+1596.273520913" lastFinishedPulling="2026-03-20 17:43:08.827819295 +0000 UTC m=+1598.739411365" observedRunningTime="2026-03-20 17:43:09.44073612 +0000 UTC m=+1599.352328200" watchObservedRunningTime="2026-03-20 17:43:09.445493822 +0000 UTC m=+1599.357085922"
Mar 20 17:43:10 crc kubenswrapper[4803]: I0320 17:43:10.404328 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqtw6" event={"ID":"3b08e67e-e045-4c6e-8eba-69057b4512be","Type":"ContainerStarted","Data":"cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38"}
Mar 20 17:43:10 crc kubenswrapper[4803]: E0320 17:43:10.646977 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b08e67e_e045_4c6e_8eba_69057b4512be.slice/crio-cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 17:43:12 crc kubenswrapper[4803]: I0320 17:43:12.432209 4803 generic.go:334] "Generic (PLEG): container finished" podID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerID="cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38" exitCode=0
Mar 20 17:43:12 crc kubenswrapper[4803]: I0320 17:43:12.432261 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqtw6" event={"ID":"3b08e67e-e045-4c6e-8eba-69057b4512be","Type":"ContainerDied","Data":"cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38"}
Mar 20 17:43:12 crc kubenswrapper[4803]: I0320 17:43:12.849249 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:43:12 crc kubenswrapper[4803]: E0320 17:43:12.849440 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:43:13 crc kubenswrapper[4803]: I0320 17:43:13.447591 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqtw6" event={"ID":"3b08e67e-e045-4c6e-8eba-69057b4512be","Type":"ContainerStarted","Data":"6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2"}
Mar 20 17:43:13 crc kubenswrapper[4803]: I0320 17:43:13.470732 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lqtw6" podStartSLOduration=2.010671672 podStartE2EDuration="5.470707642s" podCreationTimestamp="2026-03-20 17:43:08 +0000 UTC" firstStartedPulling="2026-03-20 17:43:09.390450086 +0000 UTC m=+1599.302042156" lastFinishedPulling="2026-03-20 17:43:12.850486056 +0000 UTC m=+1602.762078126" observedRunningTime="2026-03-20 17:43:13.4684899 +0000 UTC m=+1603.380081970" watchObservedRunningTime="2026-03-20 17:43:13.470707642 +0000 UTC m=+1603.382299742"
Mar 20 17:43:15 crc kubenswrapper[4803]: I0320 17:43:15.473874 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qgglg"
Mar 20 17:43:15 crc kubenswrapper[4803]: I0320 17:43:15.474223 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qgglg"
Mar 20 17:43:15 crc kubenswrapper[4803]: I0320 17:43:15.540991 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qgglg"
Mar 20 17:43:16 crc kubenswrapper[4803]: I0320 17:43:16.533716 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qgglg"
Mar 20 17:43:16 crc kubenswrapper[4803]: I0320 17:43:16.693757 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgglg"]
Mar 20 17:43:18 crc kubenswrapper[4803]: I0320 17:43:18.450307 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:18 crc kubenswrapper[4803]: I0320 17:43:18.450712 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:18 crc kubenswrapper[4803]: I0320 17:43:18.494798 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qgglg" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="registry-server" containerID="cri-o://57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151" gracePeriod=2
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.036141 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgglg"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.140323 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckgzm\" (UniqueName: \"kubernetes.io/projected/fd0628d1-60c8-45a5-a01e-01c6566e13a7-kube-api-access-ckgzm\") pod \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") "
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.140503 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-catalog-content\") pod \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") "
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.140539 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-utilities\") pod \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\" (UID: \"fd0628d1-60c8-45a5-a01e-01c6566e13a7\") "
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.141442 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-utilities" (OuterVolumeSpecName: "utilities") pod "fd0628d1-60c8-45a5-a01e-01c6566e13a7" (UID: "fd0628d1-60c8-45a5-a01e-01c6566e13a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.152761 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0628d1-60c8-45a5-a01e-01c6566e13a7-kube-api-access-ckgzm" (OuterVolumeSpecName: "kube-api-access-ckgzm") pod "fd0628d1-60c8-45a5-a01e-01c6566e13a7" (UID: "fd0628d1-60c8-45a5-a01e-01c6566e13a7"). InnerVolumeSpecName "kube-api-access-ckgzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.175105 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd0628d1-60c8-45a5-a01e-01c6566e13a7" (UID: "fd0628d1-60c8-45a5-a01e-01c6566e13a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.242572 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckgzm\" (UniqueName: \"kubernetes.io/projected/fd0628d1-60c8-45a5-a01e-01c6566e13a7-kube-api-access-ckgzm\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.242603 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.242613 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd0628d1-60c8-45a5-a01e-01c6566e13a7-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.504904 4803 generic.go:334] "Generic (PLEG): container finished" podID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerID="57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151" exitCode=0
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.504956 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgglg" event={"ID":"fd0628d1-60c8-45a5-a01e-01c6566e13a7","Type":"ContainerDied","Data":"57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151"}
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.505015 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qgglg" event={"ID":"fd0628d1-60c8-45a5-a01e-01c6566e13a7","Type":"ContainerDied","Data":"67226b12098127e45ef3cfe91c71e43723a8d4997d1ac9426d5870c8a261a290"}
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.505038 4803 scope.go:117] "RemoveContainer" containerID="57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.506243 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qgglg"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.527924 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-lqtw6" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="registry-server" probeResult="failure" output=<
Mar 20 17:43:19 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s
Mar 20 17:43:19 crc kubenswrapper[4803]: >
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.536515 4803 scope.go:117] "RemoveContainer" containerID="d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.547168 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgglg"]
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.558173 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qgglg"]
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.572761 4803 scope.go:117] "RemoveContainer" containerID="4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.617372 4803 scope.go:117] "RemoveContainer" containerID="57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151"
Mar 20 17:43:19 crc kubenswrapper[4803]: E0320 17:43:19.617813 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151\": container with ID starting with 57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151 not found: ID does not exist" containerID="57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.617859 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151"} err="failed to get container status \"57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151\": rpc error: code = NotFound desc = could not find container \"57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151\": container with ID starting with 57a4987a930c5ad879eaaab7f6ff8a498c6de63e97ff0832cdc6b6dfa7492151 not found: ID does not exist"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.617893 4803 scope.go:117] "RemoveContainer" containerID="d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181"
Mar 20 17:43:19 crc kubenswrapper[4803]: E0320 17:43:19.618296 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181\": container with ID starting with d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181 not found: ID does not exist" containerID="d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.618326 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181"} err="failed to get container status \"d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181\": rpc error: code = NotFound desc = could not find container \"d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181\": container with ID starting with d19b6476007d788c6690ed50b5fa2d8a77319c0083607c90b82ea592d1001181 not found: ID does not exist"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.618346 4803 scope.go:117] "RemoveContainer" containerID="4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b"
Mar 20 17:43:19 crc kubenswrapper[4803]: E0320 17:43:19.618612 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b\": container with ID starting with 4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b not found: ID does not exist" containerID="4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b"
Mar 20 17:43:19 crc kubenswrapper[4803]: I0320 17:43:19.618654 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b"} err="failed to get container status \"4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b\": rpc error: code = NotFound desc = could not find container \"4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b\": container with ID starting with 4f234cc6002b7c98906d3dd37f3336ba0d176d8cb44f6b46dcb89b43e2bd4f4b not found: ID does not exist"
Mar 20 17:43:20 crc kubenswrapper[4803]: I0320 17:43:20.868646 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" path="/var/lib/kubelet/pods/fd0628d1-60c8-45a5-a01e-01c6566e13a7/volumes"
Mar 20 17:43:21 crc kubenswrapper[4803]: I0320 17:43:21.394311 4803 scope.go:117] "RemoveContainer" containerID="2c20443cf723ee8f0def494ccc438fd0d53ae77c16b6fc07da8299222a2245b5"
Mar 20 17:43:21 crc kubenswrapper[4803]: I0320 17:43:21.436534 4803 scope.go:117] "RemoveContainer" containerID="81a6ad1f271b17b2925da46daa67eb84639642beee0928c9d4179daf0501e2ed"
Mar 20 17:43:21 crc kubenswrapper[4803]: I0320 17:43:21.512513 4803 scope.go:117] "RemoveContainer" containerID="ae3678264c6fafa26d473651cf78cccc118b69c9b56377f978a54d4f9553f1e4"
Mar 20 17:43:21 crc kubenswrapper[4803]: I0320 17:43:21.576157 4803 scope.go:117] "RemoveContainer" containerID="246a4b0f715b2b9961f880a5b5939f4b4026cb26304f6dc56601a263a444b0a3"
Mar 20 17:43:26 crc kubenswrapper[4803]: I0320 17:43:26.849348 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:43:26 crc kubenswrapper[4803]: E0320 17:43:26.850682 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:43:28 crc kubenswrapper[4803]: I0320 17:43:28.532484 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:28 crc kubenswrapper[4803]: I0320 17:43:28.599278 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:28 crc kubenswrapper[4803]: I0320 17:43:28.790737 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lqtw6"]
Mar 20 17:43:29 crc kubenswrapper[4803]: I0320 17:43:29.623717 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lqtw6" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="registry-server" containerID="cri-o://6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2" gracePeriod=2
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.202611 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.278633 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-utilities\") pod \"3b08e67e-e045-4c6e-8eba-69057b4512be\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") "
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.278871 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/3b08e67e-e045-4c6e-8eba-69057b4512be-kube-api-access-xc6w9\") pod \"3b08e67e-e045-4c6e-8eba-69057b4512be\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") "
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.279126 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-catalog-content\") pod \"3b08e67e-e045-4c6e-8eba-69057b4512be\" (UID: \"3b08e67e-e045-4c6e-8eba-69057b4512be\") "
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.279609 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-utilities" (OuterVolumeSpecName: "utilities") pod "3b08e67e-e045-4c6e-8eba-69057b4512be" (UID: "3b08e67e-e045-4c6e-8eba-69057b4512be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.288105 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b08e67e-e045-4c6e-8eba-69057b4512be-kube-api-access-xc6w9" (OuterVolumeSpecName: "kube-api-access-xc6w9") pod "3b08e67e-e045-4c6e-8eba-69057b4512be" (UID: "3b08e67e-e045-4c6e-8eba-69057b4512be"). InnerVolumeSpecName "kube-api-access-xc6w9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.346293 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b08e67e-e045-4c6e-8eba-69057b4512be" (UID: "3b08e67e-e045-4c6e-8eba-69057b4512be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.381912 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.381951 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc6w9\" (UniqueName: \"kubernetes.io/projected/3b08e67e-e045-4c6e-8eba-69057b4512be-kube-api-access-xc6w9\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.381962 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b08e67e-e045-4c6e-8eba-69057b4512be-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.640802 4803 generic.go:334] "Generic (PLEG): container finished" podID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerID="6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2" exitCode=0
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.640886 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqtw6" event={"ID":"3b08e67e-e045-4c6e-8eba-69057b4512be","Type":"ContainerDied","Data":"6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2"}
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.640966 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lqtw6" event={"ID":"3b08e67e-e045-4c6e-8eba-69057b4512be","Type":"ContainerDied","Data":"ffac6623176f14b0cde4131d217c578ff09ae033348e83207e18f8c68b3b848d"}
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.641013 4803 scope.go:117] "RemoveContainer" containerID="6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.640892 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lqtw6"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.689968 4803 scope.go:117] "RemoveContainer" containerID="cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.719281 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lqtw6"]
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.732465 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lqtw6"]
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.735067 4803 scope.go:117] "RemoveContainer" containerID="b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.798869 4803 scope.go:117] "RemoveContainer" containerID="6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2"
Mar 20 17:43:30 crc kubenswrapper[4803]: E0320 17:43:30.800755 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2\": container with ID starting with 6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2 not found: ID does not exist" containerID="6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.800809 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2"} err="failed to get container status \"6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2\": rpc error: code = NotFound desc = could not find container \"6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2\": container with ID starting with 6e372ff484248c35b9c462c4c097f3de311174b833baabffc2e97421d045f5e2 not found: ID does not exist"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.800841 4803 scope.go:117] "RemoveContainer" containerID="cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38"
Mar 20 17:43:30 crc kubenswrapper[4803]: E0320 17:43:30.801839 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38\": container with ID starting with cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38 not found: ID does not exist" containerID="cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.801935 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38"} err="failed to get container status \"cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38\": rpc error: code = NotFound desc = could not find container \"cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38\": container with ID starting with cc206512a7c994b4b3f3ca44c4d10e3ba953c2f667fc24eb7340e1d66947ee38 not found: ID does not exist"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.801986 4803 scope.go:117] "RemoveContainer" containerID="b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce"
Mar 20 17:43:30 crc kubenswrapper[4803]: E0320 17:43:30.802561 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce\": container with ID starting with b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce not found: ID does not exist" containerID="b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.802608 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce"} err="failed to get container status \"b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce\": rpc error: code = NotFound desc = could not find container \"b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce\": container with ID starting with b89f1844dfc6d750ddee6d719aa1bdeda8dce768224c298f1a12393fc50a9dce not found: ID does not exist"
Mar 20 17:43:30 crc kubenswrapper[4803]: I0320 17:43:30.864712 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" path="/var/lib/kubelet/pods/3b08e67e-e045-4c6e-8eba-69057b4512be/volumes"
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.080625 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ht9gr"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.104640 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ea93-account-create-update-k5pbr"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.113114 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kxd5z"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.122220 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3d78-account-create-update-7dwj2"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.130134 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-91c6-account-create-update-pvxqz"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.137486 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4c46w"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.143859 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3d78-account-create-update-7dwj2"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.177541 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ht9gr"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.189666 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ea93-account-create-update-k5pbr"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.206194 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kxd5z"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.218440 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4c46w"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.231641 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-91c6-account-create-update-pvxqz"]
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.869325 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c33e7b8-552f-4505-ab8d-40fe6c121314" path="/var/lib/kubelet/pods/0c33e7b8-552f-4505-ab8d-40fe6c121314/volumes"
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.870894 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64a509f5-e0df-432f-aa48-1c59aa990d09" path="/var/lib/kubelet/pods/64a509f5-e0df-432f-aa48-1c59aa990d09/volumes"
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.872357 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95eec546-fa73-4572-8833-a92c4c020052" path="/var/lib/kubelet/pods/95eec546-fa73-4572-8833-a92c4c020052/volumes"
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.873801 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ea12874-ecdb-41be-a9d0-5d5ee5e57c82" path="/var/lib/kubelet/pods/9ea12874-ecdb-41be-a9d0-5d5ee5e57c82/volumes"
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.876455 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5da372c-0b9a-4cbe-933a-a29f90ef2db6" path="/var/lib/kubelet/pods/b5da372c-0b9a-4cbe-933a-a29f90ef2db6/volumes"
Mar 20 17:43:38 crc kubenswrapper[4803]: I0320 17:43:38.877922 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb" path="/var/lib/kubelet/pods/ba41ebed-4eb8-44f5-bcfa-7ac4f32b9afb/volumes"
Mar 20 17:43:39 crc kubenswrapper[4803]: I0320 17:43:39.848627 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:43:39 crc kubenswrapper[4803]: E0320 17:43:39.849205 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:43:50 crc kubenswrapper[4803]: I0320 17:43:50.855231 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:43:50 crc kubenswrapper[4803]: E0320 17:43:50.856421 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.040850 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-6j2lv"]
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.052401 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-6j2lv"]
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.146554 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567144-26gb2"]
Mar 20 17:44:00 crc kubenswrapper[4803]: E0320 17:44:00.147408 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.147590 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4803]: E0320 17:44:00.147748 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.147867 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4803]: E0320 17:44:00.147998 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.148112 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4803]: E0320 17:44:00.148260 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.148373 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4803]: E0320 17:44:00.148589 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.148742 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="extract-content"
Mar 20 17:44:00 crc kubenswrapper[4803]: E0320 17:44:00.148898 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.149015 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="extract-utilities"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.149463 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0628d1-60c8-45a5-a01e-01c6566e13a7" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.149707 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b08e67e-e045-4c6e-8eba-69057b4512be" containerName="registry-server"
Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.150845 4803 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-26gb2" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.154677 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.155194 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.155786 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.157286 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-26gb2"] Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.280588 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwqfd\" (UniqueName: \"kubernetes.io/projected/93f3ef92-1015-4a5c-aebe-556d2e7410f2-kube-api-access-wwqfd\") pod \"auto-csr-approver-29567144-26gb2\" (UID: \"93f3ef92-1015-4a5c-aebe-556d2e7410f2\") " pod="openshift-infra/auto-csr-approver-29567144-26gb2" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.393399 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwqfd\" (UniqueName: \"kubernetes.io/projected/93f3ef92-1015-4a5c-aebe-556d2e7410f2-kube-api-access-wwqfd\") pod \"auto-csr-approver-29567144-26gb2\" (UID: \"93f3ef92-1015-4a5c-aebe-556d2e7410f2\") " pod="openshift-infra/auto-csr-approver-29567144-26gb2" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.413618 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwqfd\" (UniqueName: \"kubernetes.io/projected/93f3ef92-1015-4a5c-aebe-556d2e7410f2-kube-api-access-wwqfd\") pod \"auto-csr-approver-29567144-26gb2\" (UID: \"93f3ef92-1015-4a5c-aebe-556d2e7410f2\") " 
pod="openshift-infra/auto-csr-approver-29567144-26gb2" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.489002 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-26gb2" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.777053 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-26gb2"] Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.873145 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f747a8-b136-428e-8a84-106573e537db" path="/var/lib/kubelet/pods/97f747a8-b136-428e-8a84-106573e537db/volumes" Mar 20 17:44:00 crc kubenswrapper[4803]: I0320 17:44:00.976290 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-26gb2" event={"ID":"93f3ef92-1015-4a5c-aebe-556d2e7410f2","Type":"ContainerStarted","Data":"3581ac81c1978931cfe40b320792485756db54135b6849722c54ec75f7a2afe0"} Mar 20 17:44:03 crc kubenswrapper[4803]: I0320 17:44:03.013060 4803 generic.go:334] "Generic (PLEG): container finished" podID="93f3ef92-1015-4a5c-aebe-556d2e7410f2" containerID="4e6109e9fb0b0c46c1018e8d06715e4135977bd626b674382406c4a04806d0cb" exitCode=0 Mar 20 17:44:03 crc kubenswrapper[4803]: I0320 17:44:03.013315 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-26gb2" event={"ID":"93f3ef92-1015-4a5c-aebe-556d2e7410f2","Type":"ContainerDied","Data":"4e6109e9fb0b0c46c1018e8d06715e4135977bd626b674382406c4a04806d0cb"} Mar 20 17:44:04 crc kubenswrapper[4803]: I0320 17:44:04.364460 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-26gb2" Mar 20 17:44:04 crc kubenswrapper[4803]: I0320 17:44:04.531458 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwqfd\" (UniqueName: \"kubernetes.io/projected/93f3ef92-1015-4a5c-aebe-556d2e7410f2-kube-api-access-wwqfd\") pod \"93f3ef92-1015-4a5c-aebe-556d2e7410f2\" (UID: \"93f3ef92-1015-4a5c-aebe-556d2e7410f2\") " Mar 20 17:44:04 crc kubenswrapper[4803]: I0320 17:44:04.537319 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f3ef92-1015-4a5c-aebe-556d2e7410f2-kube-api-access-wwqfd" (OuterVolumeSpecName: "kube-api-access-wwqfd") pod "93f3ef92-1015-4a5c-aebe-556d2e7410f2" (UID: "93f3ef92-1015-4a5c-aebe-556d2e7410f2"). InnerVolumeSpecName "kube-api-access-wwqfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:44:04 crc kubenswrapper[4803]: I0320 17:44:04.634926 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwqfd\" (UniqueName: \"kubernetes.io/projected/93f3ef92-1015-4a5c-aebe-556d2e7410f2-kube-api-access-wwqfd\") on node \"crc\" DevicePath \"\"" Mar 20 17:44:05 crc kubenswrapper[4803]: I0320 17:44:05.037679 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567144-26gb2" event={"ID":"93f3ef92-1015-4a5c-aebe-556d2e7410f2","Type":"ContainerDied","Data":"3581ac81c1978931cfe40b320792485756db54135b6849722c54ec75f7a2afe0"} Mar 20 17:44:05 crc kubenswrapper[4803]: I0320 17:44:05.037746 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3581ac81c1978931cfe40b320792485756db54135b6849722c54ec75f7a2afe0" Mar 20 17:44:05 crc kubenswrapper[4803]: I0320 17:44:05.037818 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567144-26gb2" Mar 20 17:44:05 crc kubenswrapper[4803]: I0320 17:44:05.451875 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-qz74x"] Mar 20 17:44:05 crc kubenswrapper[4803]: I0320 17:44:05.461471 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567138-qz74x"] Mar 20 17:44:05 crc kubenswrapper[4803]: I0320 17:44:05.849363 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:44:05 crc kubenswrapper[4803]: E0320 17:44:05.849861 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:44:06 crc kubenswrapper[4803]: I0320 17:44:06.866115 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c6fd02-f9d2-4a83-8042-5bbf9c583f51" path="/var/lib/kubelet/pods/d9c6fd02-f9d2-4a83-8042-5bbf9c583f51/volumes" Mar 20 17:44:09 crc kubenswrapper[4803]: I0320 17:44:09.040355 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ww57h"] Mar 20 17:44:09 crc kubenswrapper[4803]: I0320 17:44:09.055248 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ww57h"] Mar 20 17:44:10 crc kubenswrapper[4803]: I0320 17:44:10.867868 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7c27f17-4261-4a5c-830f-687a37c483fe" path="/var/lib/kubelet/pods/b7c27f17-4261-4a5c-830f-687a37c483fe/volumes" Mar 20 17:44:13 crc kubenswrapper[4803]: I0320 17:44:13.042869 4803 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-a62f-account-create-update-gxqzm"] Mar 20 17:44:13 crc kubenswrapper[4803]: I0320 17:44:13.057922 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-9kd4s"] Mar 20 17:44:13 crc kubenswrapper[4803]: I0320 17:44:13.067575 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a62f-account-create-update-gxqzm"] Mar 20 17:44:13 crc kubenswrapper[4803]: I0320 17:44:13.077235 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-9kd4s"] Mar 20 17:44:13 crc kubenswrapper[4803]: I0320 17:44:13.149330 4803 generic.go:334] "Generic (PLEG): container finished" podID="c2c45e4b-27e5-4614-875f-838444583617" containerID="c567efd90c79dafa7ea9860215626b074c8400f14589efd3f83c6b4a33c5e0ff" exitCode=0 Mar 20 17:44:13 crc kubenswrapper[4803]: I0320 17:44:13.149429 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" event={"ID":"c2c45e4b-27e5-4614-875f-838444583617","Type":"ContainerDied","Data":"c567efd90c79dafa7ea9860215626b074c8400f14589efd3f83c6b4a33c5e0ff"} Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.723580 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.838392 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6znfb\" (UniqueName: \"kubernetes.io/projected/c2c45e4b-27e5-4614-875f-838444583617-kube-api-access-6znfb\") pod \"c2c45e4b-27e5-4614-875f-838444583617\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.838581 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-ssh-key-openstack-edpm-ipam\") pod \"c2c45e4b-27e5-4614-875f-838444583617\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.838702 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-inventory\") pod \"c2c45e4b-27e5-4614-875f-838444583617\" (UID: \"c2c45e4b-27e5-4614-875f-838444583617\") " Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.846989 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c45e4b-27e5-4614-875f-838444583617-kube-api-access-6znfb" (OuterVolumeSpecName: "kube-api-access-6znfb") pod "c2c45e4b-27e5-4614-875f-838444583617" (UID: "c2c45e4b-27e5-4614-875f-838444583617"). InnerVolumeSpecName "kube-api-access-6znfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.868917 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea23ab0-9514-4256-aa1b-d477da1a19fe" path="/var/lib/kubelet/pods/6ea23ab0-9514-4256-aa1b-d477da1a19fe/volumes" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.869542 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb3f389-31c1-4ada-838c-ccef884cc082" path="/var/lib/kubelet/pods/afb3f389-31c1-4ada-838c-ccef884cc082/volumes" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.891803 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-inventory" (OuterVolumeSpecName: "inventory") pod "c2c45e4b-27e5-4614-875f-838444583617" (UID: "c2c45e4b-27e5-4614-875f-838444583617"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.892015 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c2c45e4b-27e5-4614-875f-838444583617" (UID: "c2c45e4b-27e5-4614-875f-838444583617"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.942791 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.942861 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6znfb\" (UniqueName: \"kubernetes.io/projected/c2c45e4b-27e5-4614-875f-838444583617-kube-api-access-6znfb\") on node \"crc\" DevicePath \"\"" Mar 20 17:44:14 crc kubenswrapper[4803]: I0320 17:44:14.942895 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c2c45e4b-27e5-4614-875f-838444583617-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.179752 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" event={"ID":"c2c45e4b-27e5-4614-875f-838444583617","Type":"ContainerDied","Data":"0cecb3581a66907a8803773be4910abff5e60d1426f8da0f6a9c67fb8fe3beba"} Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.179814 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cecb3581a66907a8803773be4910abff5e60d1426f8da0f6a9c67fb8fe3beba" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.179901 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.276385 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5"] Mar 20 17:44:15 crc kubenswrapper[4803]: E0320 17:44:15.277060 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2c45e4b-27e5-4614-875f-838444583617" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.277185 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2c45e4b-27e5-4614-875f-838444583617" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:44:15 crc kubenswrapper[4803]: E0320 17:44:15.277291 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f3ef92-1015-4a5c-aebe-556d2e7410f2" containerName="oc" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.277361 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f3ef92-1015-4a5c-aebe-556d2e7410f2" containerName="oc" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.277643 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2c45e4b-27e5-4614-875f-838444583617" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.277785 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f3ef92-1015-4a5c-aebe-556d2e7410f2" containerName="oc" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.278804 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.284942 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.287234 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.289030 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.289151 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.297087 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5"] Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.452911 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.453049 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc 
kubenswrapper[4803]: I0320 17:44:15.453313 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22xdc\" (UniqueName: \"kubernetes.io/projected/04e7436c-88e7-4d1b-b078-cbee6adf422d-kube-api-access-22xdc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.555932 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.556766 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.557009 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22xdc\" (UniqueName: \"kubernetes.io/projected/04e7436c-88e7-4d1b-b078-cbee6adf422d-kube-api-access-22xdc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.566825 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.566888 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.580841 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22xdc\" (UniqueName: \"kubernetes.io/projected/04e7436c-88e7-4d1b-b078-cbee6adf422d-kube-api-access-22xdc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:15 crc kubenswrapper[4803]: I0320 17:44:15.613591 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:44:16 crc kubenswrapper[4803]: I0320 17:44:16.185860 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.062745 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-tq5tc"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.080385 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h24x7"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.090255 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-4eea-account-create-update-lxf2n"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.099149 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e380-account-create-update-9mlv9"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.105337 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-4eea-account-create-update-lxf2n"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.111203 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h24x7"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.117600 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-tq5tc"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.124009 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e380-account-create-update-9mlv9"] Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.204417 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" event={"ID":"04e7436c-88e7-4d1b-b078-cbee6adf422d","Type":"ContainerStarted","Data":"2eec38210f3b1d85f53bb46050660d6615e5312c3a01ed562b094bbdbb443b38"} Mar 20 
17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.204504 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" event={"ID":"04e7436c-88e7-4d1b-b078-cbee6adf422d","Type":"ContainerStarted","Data":"2a076a757063c6bc3312f2f900136ce195b36e596d53bfa8f0ff9f370ee1794d"} Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.230073 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" podStartSLOduration=1.769627772 podStartE2EDuration="2.230053901s" podCreationTimestamp="2026-03-20 17:44:15 +0000 UTC" firstStartedPulling="2026-03-20 17:44:16.19033532 +0000 UTC m=+1666.101927430" lastFinishedPulling="2026-03-20 17:44:16.650761489 +0000 UTC m=+1666.562353559" observedRunningTime="2026-03-20 17:44:17.227587893 +0000 UTC m=+1667.139180043" watchObservedRunningTime="2026-03-20 17:44:17.230053901 +0000 UTC m=+1667.141645971" Mar 20 17:44:17 crc kubenswrapper[4803]: I0320 17:44:17.849783 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:44:17 crc kubenswrapper[4803]: E0320 17:44:17.850203 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:44:18 crc kubenswrapper[4803]: I0320 17:44:18.868288 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc" path="/var/lib/kubelet/pods/9a0adb59-d5ac-4535-b5f9-2fd0e0ec48bc/volumes" Mar 20 17:44:18 crc kubenswrapper[4803]: I0320 17:44:18.869910 4803 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1d16ab-bd85-4b4b-8862-19f134432523" path="/var/lib/kubelet/pods/bf1d16ab-bd85-4b4b-8862-19f134432523/volumes" Mar 20 17:44:18 crc kubenswrapper[4803]: I0320 17:44:18.871101 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b50998-7271-4d76-bf90-7a23ab8ae295" path="/var/lib/kubelet/pods/d9b50998-7271-4d76-bf90-7a23ab8ae295/volumes" Mar 20 17:44:18 crc kubenswrapper[4803]: I0320 17:44:18.872248 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f19fe4f0-4b51-440d-82ad-28541b098fc4" path="/var/lib/kubelet/pods/f19fe4f0-4b51-440d-82ad-28541b098fc4/volumes" Mar 20 17:44:21 crc kubenswrapper[4803]: I0320 17:44:21.671045 4803 scope.go:117] "RemoveContainer" containerID="45b0dbf2ad3242b1295eaee87dce821ddb72767b65fa42e6566430ee2ec75449" Mar 20 17:44:21 crc kubenswrapper[4803]: I0320 17:44:21.709093 4803 scope.go:117] "RemoveContainer" containerID="0a01bbe65e7decf9a49a95da813e2ccfce4aa71e1bb177902eaaa6e71375b94f" Mar 20 17:44:21 crc kubenswrapper[4803]: I0320 17:44:21.776407 4803 scope.go:117] "RemoveContainer" containerID="55a8a2be8373fbfafefdb8b0b3bb484de9bba9af837f6c1e0ae3a642c48a1793" Mar 20 17:44:21 crc kubenswrapper[4803]: I0320 17:44:21.814294 4803 scope.go:117] "RemoveContainer" containerID="77674519c2307d61508c3fcc6ae48715c4d07ed315245c2b437b24b6d24d20e4" Mar 20 17:44:21 crc kubenswrapper[4803]: I0320 17:44:21.886423 4803 scope.go:117] "RemoveContainer" containerID="054a8d5e8e8739faae07ac7fe79d328ced4d370f2b2fef4dc90a20397c2e0daf" Mar 20 17:44:21 crc kubenswrapper[4803]: I0320 17:44:21.932948 4803 scope.go:117] "RemoveContainer" containerID="e453ee4325cb778a15e6f9d4625eb9c198bef9e3322946ecea40b0215628f412" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.003008 4803 scope.go:117] "RemoveContainer" containerID="1950163278c61edad4a406e974e57dc2848599bf8e0a7c395668f62546453be9" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.030262 
4803 scope.go:117] "RemoveContainer" containerID="6f9fd10105318d191923863006281bacde7b8b94ad39448a0cd58552195a922d" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.045312 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-sc6nk"] Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.058973 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-sc6nk"] Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.067234 4803 scope.go:117] "RemoveContainer" containerID="346af28290c06d13505c4d476f70328afa1c19d4692c00d605035136c8f87c9b" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.111402 4803 scope.go:117] "RemoveContainer" containerID="9b7f4335a75ea34ea1b5be584102f7e23bc75a644f114489b39bdea7f4a77f5f" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.135383 4803 scope.go:117] "RemoveContainer" containerID="94758365f09a0875ee0eca12212127bcd6701e737b8d48c3154a20b4ac15a0c8" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.155052 4803 scope.go:117] "RemoveContainer" containerID="ab2bef885a472f8999e3ab99d73dad338bfd52e3c51613776827341946719d14" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.176964 4803 scope.go:117] "RemoveContainer" containerID="2ed94c68c07d9b2f49e25944f7987b5350638170c181ca2258d9ccc97cf4ca3e" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.204273 4803 scope.go:117] "RemoveContainer" containerID="eb7c4c6fe2cd60e6bc64901df51792ffc3d0aeb5cbfa1dd687e92203af9cdaf2" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.241290 4803 scope.go:117] "RemoveContainer" containerID="7c956abc5b56c59be9108d5a8dfde4852838a3032ca83a8d03d0298c2d65df92" Mar 20 17:44:22 crc kubenswrapper[4803]: I0320 17:44:22.862829 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2" path="/var/lib/kubelet/pods/07ee6105-cfb6-4e86-9a7f-bfa66d89aaf2/volumes" Mar 20 17:44:30 crc kubenswrapper[4803]: I0320 17:44:30.856139 
4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:44:30 crc kubenswrapper[4803]: E0320 17:44:30.858893 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:44:41 crc kubenswrapper[4803]: I0320 17:44:41.848005 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:44:41 crc kubenswrapper[4803]: E0320 17:44:41.848803 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:44:53 crc kubenswrapper[4803]: I0320 17:44:53.081116 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-h45zt"] Mar 20 17:44:53 crc kubenswrapper[4803]: I0320 17:44:53.090396 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-h45zt"] Mar 20 17:44:54 crc kubenswrapper[4803]: I0320 17:44:54.870895 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ff4df91-5788-4dc9-a817-6c6a41bb955c" path="/var/lib/kubelet/pods/1ff4df91-5788-4dc9-a817-6c6a41bb955c/volumes" Mar 20 17:44:55 crc kubenswrapper[4803]: I0320 17:44:55.848194 4803 scope.go:117] "RemoveContainer" 
containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:44:55 crc kubenswrapper[4803]: E0320 17:44:55.848438 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.147239 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6"] Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.148743 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.151780 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.151906 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.163318 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6"] Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.266477 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzvqg\" (UniqueName: \"kubernetes.io/projected/800f6249-746f-4ac2-9bab-30998a996ac2-kube-api-access-nzvqg\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.266813 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/800f6249-746f-4ac2-9bab-30998a996ac2-config-volume\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.266836 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/800f6249-746f-4ac2-9bab-30998a996ac2-secret-volume\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.368255 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzvqg\" (UniqueName: \"kubernetes.io/projected/800f6249-746f-4ac2-9bab-30998a996ac2-kube-api-access-nzvqg\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.368304 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/800f6249-746f-4ac2-9bab-30998a996ac2-config-volume\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.368328 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/800f6249-746f-4ac2-9bab-30998a996ac2-secret-volume\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.369324 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/800f6249-746f-4ac2-9bab-30998a996ac2-config-volume\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.379008 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/800f6249-746f-4ac2-9bab-30998a996ac2-secret-volume\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.387891 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzvqg\" (UniqueName: \"kubernetes.io/projected/800f6249-746f-4ac2-9bab-30998a996ac2-kube-api-access-nzvqg\") pod \"collect-profiles-29567145-wwwq6\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.479656 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:00 crc kubenswrapper[4803]: I0320 17:45:00.906966 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6"] Mar 20 17:45:00 crc kubenswrapper[4803]: W0320 17:45:00.916717 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod800f6249_746f_4ac2_9bab_30998a996ac2.slice/crio-867240765e436d9cf59c034e1231909040c1ec0ddea096b7d90dc46fb9815b42 WatchSource:0}: Error finding container 867240765e436d9cf59c034e1231909040c1ec0ddea096b7d90dc46fb9815b42: Status 404 returned error can't find the container with id 867240765e436d9cf59c034e1231909040c1ec0ddea096b7d90dc46fb9815b42 Mar 20 17:45:01 crc kubenswrapper[4803]: I0320 17:45:01.760867 4803 generic.go:334] "Generic (PLEG): container finished" podID="800f6249-746f-4ac2-9bab-30998a996ac2" containerID="f39a7d68a5f754507f4b500e0ca0897cc100176caaa71f1f01f153ed34f2e494" exitCode=0 Mar 20 17:45:01 crc kubenswrapper[4803]: I0320 17:45:01.760939 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" event={"ID":"800f6249-746f-4ac2-9bab-30998a996ac2","Type":"ContainerDied","Data":"f39a7d68a5f754507f4b500e0ca0897cc100176caaa71f1f01f153ed34f2e494"} Mar 20 17:45:01 crc kubenswrapper[4803]: I0320 17:45:01.760977 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" event={"ID":"800f6249-746f-4ac2-9bab-30998a996ac2","Type":"ContainerStarted","Data":"867240765e436d9cf59c034e1231909040c1ec0ddea096b7d90dc46fb9815b42"} Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.094666 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.229441 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/800f6249-746f-4ac2-9bab-30998a996ac2-secret-volume\") pod \"800f6249-746f-4ac2-9bab-30998a996ac2\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.229594 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/800f6249-746f-4ac2-9bab-30998a996ac2-config-volume\") pod \"800f6249-746f-4ac2-9bab-30998a996ac2\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.229818 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzvqg\" (UniqueName: \"kubernetes.io/projected/800f6249-746f-4ac2-9bab-30998a996ac2-kube-api-access-nzvqg\") pod \"800f6249-746f-4ac2-9bab-30998a996ac2\" (UID: \"800f6249-746f-4ac2-9bab-30998a996ac2\") " Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.230252 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/800f6249-746f-4ac2-9bab-30998a996ac2-config-volume" (OuterVolumeSpecName: "config-volume") pod "800f6249-746f-4ac2-9bab-30998a996ac2" (UID: "800f6249-746f-4ac2-9bab-30998a996ac2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.230954 4803 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/800f6249-746f-4ac2-9bab-30998a996ac2-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.235970 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/800f6249-746f-4ac2-9bab-30998a996ac2-kube-api-access-nzvqg" (OuterVolumeSpecName: "kube-api-access-nzvqg") pod "800f6249-746f-4ac2-9bab-30998a996ac2" (UID: "800f6249-746f-4ac2-9bab-30998a996ac2"). InnerVolumeSpecName "kube-api-access-nzvqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.236978 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/800f6249-746f-4ac2-9bab-30998a996ac2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "800f6249-746f-4ac2-9bab-30998a996ac2" (UID: "800f6249-746f-4ac2-9bab-30998a996ac2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.333316 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzvqg\" (UniqueName: \"kubernetes.io/projected/800f6249-746f-4ac2-9bab-30998a996ac2-kube-api-access-nzvqg\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.333371 4803 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/800f6249-746f-4ac2-9bab-30998a996ac2-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.779233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" event={"ID":"800f6249-746f-4ac2-9bab-30998a996ac2","Type":"ContainerDied","Data":"867240765e436d9cf59c034e1231909040c1ec0ddea096b7d90dc46fb9815b42"} Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.779580 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="867240765e436d9cf59c034e1231909040c1ec0ddea096b7d90dc46fb9815b42" Mar 20 17:45:03 crc kubenswrapper[4803]: I0320 17:45:03.779270 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6" Mar 20 17:45:04 crc kubenswrapper[4803]: I0320 17:45:04.028582 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-75rtg"] Mar 20 17:45:04 crc kubenswrapper[4803]: I0320 17:45:04.036518 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-75rtg"] Mar 20 17:45:04 crc kubenswrapper[4803]: I0320 17:45:04.874877 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d002f5-379f-4709-a3df-aeb253a8884b" path="/var/lib/kubelet/pods/e6d002f5-379f-4709-a3df-aeb253a8884b/volumes" Mar 20 17:45:07 crc kubenswrapper[4803]: I0320 17:45:07.038372 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xfvq7"] Mar 20 17:45:07 crc kubenswrapper[4803]: I0320 17:45:07.056505 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xfvq7"] Mar 20 17:45:08 crc kubenswrapper[4803]: I0320 17:45:08.848896 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:45:08 crc kubenswrapper[4803]: E0320 17:45:08.849607 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:45:08 crc kubenswrapper[4803]: I0320 17:45:08.860593 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d182f2-4d18-46d2-b9a2-349bf6ddb311" path="/var/lib/kubelet/pods/a8d182f2-4d18-46d2-b9a2-349bf6ddb311/volumes" Mar 20 17:45:12 crc kubenswrapper[4803]: I0320 17:45:12.029692 4803 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-xm2fq"] Mar 20 17:45:12 crc kubenswrapper[4803]: I0320 17:45:12.037814 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-xm2fq"] Mar 20 17:45:12 crc kubenswrapper[4803]: I0320 17:45:12.863773 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3de51a-19ff-4714-b839-921efeeb3e48" path="/var/lib/kubelet/pods/1f3de51a-19ff-4714-b839-921efeeb3e48/volumes" Mar 20 17:45:18 crc kubenswrapper[4803]: I0320 17:45:18.029025 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dtdjf"] Mar 20 17:45:18 crc kubenswrapper[4803]: I0320 17:45:18.042068 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-dtdjf"] Mar 20 17:45:18 crc kubenswrapper[4803]: I0320 17:45:18.862088 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c1156a-5e7a-4547-8a7b-46a55651b7a7" path="/var/lib/kubelet/pods/56c1156a-5e7a-4547-8a7b-46a55651b7a7/volumes" Mar 20 17:45:21 crc kubenswrapper[4803]: I0320 17:45:21.848866 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:45:21 crc kubenswrapper[4803]: E0320 17:45:21.849414 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:45:21 crc kubenswrapper[4803]: I0320 17:45:21.943517 4803 generic.go:334] "Generic (PLEG): container finished" podID="04e7436c-88e7-4d1b-b078-cbee6adf422d" containerID="2eec38210f3b1d85f53bb46050660d6615e5312c3a01ed562b094bbdbb443b38" exitCode=0 Mar 20 17:45:21 crc 
kubenswrapper[4803]: I0320 17:45:21.943603 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" event={"ID":"04e7436c-88e7-4d1b-b078-cbee6adf422d","Type":"ContainerDied","Data":"2eec38210f3b1d85f53bb46050660d6615e5312c3a01ed562b094bbdbb443b38"} Mar 20 17:45:22 crc kubenswrapper[4803]: I0320 17:45:22.562897 4803 scope.go:117] "RemoveContainer" containerID="fe13e8955f038320902fb18b2c56d5aa4a644899a2f52e28c950ffdc85579fe7" Mar 20 17:45:22 crc kubenswrapper[4803]: I0320 17:45:22.599882 4803 scope.go:117] "RemoveContainer" containerID="05bc60563eba9143e7f0fa1da073b73155603ad561d734a6ab388763c0f62456" Mar 20 17:45:22 crc kubenswrapper[4803]: I0320 17:45:22.670230 4803 scope.go:117] "RemoveContainer" containerID="ca6c4f87fdaed092282729f2a14553e6dc9d18b1431d15a978ef85cf1dbfdce4" Mar 20 17:45:22 crc kubenswrapper[4803]: I0320 17:45:22.718786 4803 scope.go:117] "RemoveContainer" containerID="097c610e66be84a58a9675df68e757e52647d5dbc7c78f7a7f93c2338a1b9104" Mar 20 17:45:22 crc kubenswrapper[4803]: I0320 17:45:22.765428 4803 scope.go:117] "RemoveContainer" containerID="b94a88755bb30e8fd224eaa50df2b53d3a5138c0c490af81802bb0e41335d468" Mar 20 17:45:22 crc kubenswrapper[4803]: I0320 17:45:22.788164 4803 scope.go:117] "RemoveContainer" containerID="482f2cd5e4eeff084f774916496418bf7d97bc2732fa016bab2f67e9a1e087d5" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.269447 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.334912 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-inventory\") pod \"04e7436c-88e7-4d1b-b078-cbee6adf422d\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.335242 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22xdc\" (UniqueName: \"kubernetes.io/projected/04e7436c-88e7-4d1b-b078-cbee6adf422d-kube-api-access-22xdc\") pod \"04e7436c-88e7-4d1b-b078-cbee6adf422d\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.335355 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-ssh-key-openstack-edpm-ipam\") pod \"04e7436c-88e7-4d1b-b078-cbee6adf422d\" (UID: \"04e7436c-88e7-4d1b-b078-cbee6adf422d\") " Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.344907 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e7436c-88e7-4d1b-b078-cbee6adf422d-kube-api-access-22xdc" (OuterVolumeSpecName: "kube-api-access-22xdc") pod "04e7436c-88e7-4d1b-b078-cbee6adf422d" (UID: "04e7436c-88e7-4d1b-b078-cbee6adf422d"). InnerVolumeSpecName "kube-api-access-22xdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.386496 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-inventory" (OuterVolumeSpecName: "inventory") pod "04e7436c-88e7-4d1b-b078-cbee6adf422d" (UID: "04e7436c-88e7-4d1b-b078-cbee6adf422d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.387787 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04e7436c-88e7-4d1b-b078-cbee6adf422d" (UID: "04e7436c-88e7-4d1b-b078-cbee6adf422d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.439335 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.439396 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22xdc\" (UniqueName: \"kubernetes.io/projected/04e7436c-88e7-4d1b-b078-cbee6adf422d-kube-api-access-22xdc\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.439419 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04e7436c-88e7-4d1b-b078-cbee6adf422d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.965710 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" event={"ID":"04e7436c-88e7-4d1b-b078-cbee6adf422d","Type":"ContainerDied","Data":"2a076a757063c6bc3312f2f900136ce195b36e596d53bfa8f0ff9f370ee1794d"} Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 17:45:23.965752 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a076a757063c6bc3312f2f900136ce195b36e596d53bfa8f0ff9f370ee1794d" Mar 20 17:45:23 crc kubenswrapper[4803]: I0320 
17:45:23.965808 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.076646 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr"] Mar 20 17:45:24 crc kubenswrapper[4803]: E0320 17:45:24.077167 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800f6249-746f-4ac2-9bab-30998a996ac2" containerName="collect-profiles" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.077188 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="800f6249-746f-4ac2-9bab-30998a996ac2" containerName="collect-profiles" Mar 20 17:45:24 crc kubenswrapper[4803]: E0320 17:45:24.077205 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e7436c-88e7-4d1b-b078-cbee6adf422d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.077212 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e7436c-88e7-4d1b-b078-cbee6adf422d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.077391 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="800f6249-746f-4ac2-9bab-30998a996ac2" containerName="collect-profiles" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.077408 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e7436c-88e7-4d1b-b078-cbee6adf422d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.078053 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.083191 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.083381 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.084136 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.084339 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.108560 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr"] Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.152593 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.152726 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grgtg\" (UniqueName: \"kubernetes.io/projected/64a2f129-704a-4165-88ee-2cab50cc59d9-kube-api-access-grgtg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 
17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.152758 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.254022 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.254151 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grgtg\" (UniqueName: \"kubernetes.io/projected/64a2f129-704a-4165-88ee-2cab50cc59d9-kube-api-access-grgtg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.254180 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.258771 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.260256 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.274197 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grgtg\" (UniqueName: \"kubernetes.io/projected/64a2f129-704a-4165-88ee-2cab50cc59d9-kube-api-access-grgtg\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:24 crc kubenswrapper[4803]: I0320 17:45:24.409015 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" Mar 20 17:45:25 crc kubenswrapper[4803]: I0320 17:45:25.037869 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr"] Mar 20 17:45:25 crc kubenswrapper[4803]: I0320 17:45:25.043763 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:45:25 crc kubenswrapper[4803]: I0320 17:45:25.990080 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" event={"ID":"64a2f129-704a-4165-88ee-2cab50cc59d9","Type":"ContainerStarted","Data":"71efd90537833ec1275fd3b28a61414546b44715cfa5d63c0f5b5b2ba94e0609"} Mar 20 17:45:25 crc kubenswrapper[4803]: I0320 17:45:25.993791 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" event={"ID":"64a2f129-704a-4165-88ee-2cab50cc59d9","Type":"ContainerStarted","Data":"a9ac83062991bfb38d612f7c9594e21fc85a15aac9f32157ac8b964df71dc751"} Mar 20 17:45:26 crc kubenswrapper[4803]: I0320 17:45:26.020061 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" podStartSLOduration=1.440431195 podStartE2EDuration="2.020030546s" podCreationTimestamp="2026-03-20 17:45:24 +0000 UTC" firstStartedPulling="2026-03-20 17:45:25.043344707 +0000 UTC m=+1734.954936787" lastFinishedPulling="2026-03-20 17:45:25.622944058 +0000 UTC m=+1735.534536138" observedRunningTime="2026-03-20 17:45:26.009043922 +0000 UTC m=+1735.920636042" watchObservedRunningTime="2026-03-20 17:45:26.020030546 +0000 UTC m=+1735.931622656" Mar 20 17:45:31 crc kubenswrapper[4803]: I0320 17:45:31.059634 4803 generic.go:334] "Generic (PLEG): container finished" podID="64a2f129-704a-4165-88ee-2cab50cc59d9" 
containerID="71efd90537833ec1275fd3b28a61414546b44715cfa5d63c0f5b5b2ba94e0609" exitCode=0
Mar 20 17:45:31 crc kubenswrapper[4803]: I0320 17:45:31.059766 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" event={"ID":"64a2f129-704a-4165-88ee-2cab50cc59d9","Type":"ContainerDied","Data":"71efd90537833ec1275fd3b28a61414546b44715cfa5d63c0f5b5b2ba94e0609"}
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.506008 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr"
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.558502 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-ssh-key-openstack-edpm-ipam\") pod \"64a2f129-704a-4165-88ee-2cab50cc59d9\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") "
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.558616 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-inventory\") pod \"64a2f129-704a-4165-88ee-2cab50cc59d9\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") "
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.558771 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grgtg\" (UniqueName: \"kubernetes.io/projected/64a2f129-704a-4165-88ee-2cab50cc59d9-kube-api-access-grgtg\") pod \"64a2f129-704a-4165-88ee-2cab50cc59d9\" (UID: \"64a2f129-704a-4165-88ee-2cab50cc59d9\") "
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.563976 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64a2f129-704a-4165-88ee-2cab50cc59d9-kube-api-access-grgtg" (OuterVolumeSpecName: "kube-api-access-grgtg") pod "64a2f129-704a-4165-88ee-2cab50cc59d9" (UID: "64a2f129-704a-4165-88ee-2cab50cc59d9"). InnerVolumeSpecName "kube-api-access-grgtg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.591430 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-inventory" (OuterVolumeSpecName: "inventory") pod "64a2f129-704a-4165-88ee-2cab50cc59d9" (UID: "64a2f129-704a-4165-88ee-2cab50cc59d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.598457 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "64a2f129-704a-4165-88ee-2cab50cc59d9" (UID: "64a2f129-704a-4165-88ee-2cab50cc59d9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.661258 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grgtg\" (UniqueName: \"kubernetes.io/projected/64a2f129-704a-4165-88ee-2cab50cc59d9-kube-api-access-grgtg\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.661305 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:32 crc kubenswrapper[4803]: I0320 17:45:32.661320 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/64a2f129-704a-4165-88ee-2cab50cc59d9-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.098420 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr" event={"ID":"64a2f129-704a-4165-88ee-2cab50cc59d9","Type":"ContainerDied","Data":"a9ac83062991bfb38d612f7c9594e21fc85a15aac9f32157ac8b964df71dc751"}
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.098465 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.098474 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9ac83062991bfb38d612f7c9594e21fc85a15aac9f32157ac8b964df71dc751"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.182218 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"]
Mar 20 17:45:33 crc kubenswrapper[4803]: E0320 17:45:33.183499 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64a2f129-704a-4165-88ee-2cab50cc59d9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.183573 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="64a2f129-704a-4165-88ee-2cab50cc59d9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.184048 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="64a2f129-704a-4165-88ee-2cab50cc59d9" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.185551 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.189413 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.189654 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.190109 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.190273 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.204056 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"]
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.374959 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brzhn\" (UniqueName: \"kubernetes.io/projected/de5b2da1-3a72-4b62-98a5-71352eb71c90-kube-api-access-brzhn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.375003 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.375028 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.476882 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.476925 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.477090 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brzhn\" (UniqueName: \"kubernetes.io/projected/de5b2da1-3a72-4b62-98a5-71352eb71c90-kube-api-access-brzhn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.482697 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.490434 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.498456 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brzhn\" (UniqueName: \"kubernetes.io/projected/de5b2da1-3a72-4b62-98a5-71352eb71c90-kube-api-access-brzhn\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-26xq7\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:33 crc kubenswrapper[4803]: I0320 17:45:33.507699 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:45:34 crc kubenswrapper[4803]: I0320 17:45:34.139691 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"]
Mar 20 17:45:35 crc kubenswrapper[4803]: I0320 17:45:35.120901 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7" event={"ID":"de5b2da1-3a72-4b62-98a5-71352eb71c90","Type":"ContainerStarted","Data":"5a976a793739301c6ab5cb2f8e14e83e3febb20a92c0a31b7374bb7a952cab8e"}
Mar 20 17:45:35 crc kubenswrapper[4803]: I0320 17:45:35.121414 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7" event={"ID":"de5b2da1-3a72-4b62-98a5-71352eb71c90","Type":"ContainerStarted","Data":"b80ad36e55412fe39e9485734c06231e2d63ca0edeba6069ae2c622b13848d10"}
Mar 20 17:45:35 crc kubenswrapper[4803]: I0320 17:45:35.141507 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7" podStartSLOduration=1.7193464170000001 podStartE2EDuration="2.141483021s" podCreationTimestamp="2026-03-20 17:45:33 +0000 UTC" firstStartedPulling="2026-03-20 17:45:34.158642825 +0000 UTC m=+1744.070234905" lastFinishedPulling="2026-03-20 17:45:34.580779439 +0000 UTC m=+1744.492371509" observedRunningTime="2026-03-20 17:45:35.138180657 +0000 UTC m=+1745.049772767" watchObservedRunningTime="2026-03-20 17:45:35.141483021 +0000 UTC m=+1745.053075111"
Mar 20 17:45:35 crc kubenswrapper[4803]: I0320 17:45:35.848788 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:45:35 crc kubenswrapper[4803]: E0320 17:45:35.849335 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:45:46 crc kubenswrapper[4803]: I0320 17:45:46.848796 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:45:46 crc kubenswrapper[4803]: E0320 17:45:46.849634 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:45:59 crc kubenswrapper[4803]: I0320 17:45:59.052032 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9zd4x"]
Mar 20 17:45:59 crc kubenswrapper[4803]: I0320 17:45:59.059987 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9zd4x"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.034689 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-da07-account-create-update-9c6n2"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.043819 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-da07-account-create-update-9c6n2"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.052789 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-t2hrj"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.061435 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-t2hrj"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.068875 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-x8dss"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.075336 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ded6-account-create-update-b2sc5"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.081638 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ded6-account-create-update-b2sc5"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.087755 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-x8dss"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.145003 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567146-rkt8v"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.146138 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-rkt8v"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.147651 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.148650 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.152465 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.156004 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-rkt8v"]
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.254253 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgwcg\" (UniqueName: \"kubernetes.io/projected/9ac9282e-f66c-4b81-9145-e69a7924619a-kube-api-access-fgwcg\") pod \"auto-csr-approver-29567146-rkt8v\" (UID: \"9ac9282e-f66c-4b81-9145-e69a7924619a\") " pod="openshift-infra/auto-csr-approver-29567146-rkt8v"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.356102 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgwcg\" (UniqueName: \"kubernetes.io/projected/9ac9282e-f66c-4b81-9145-e69a7924619a-kube-api-access-fgwcg\") pod \"auto-csr-approver-29567146-rkt8v\" (UID: \"9ac9282e-f66c-4b81-9145-e69a7924619a\") " pod="openshift-infra/auto-csr-approver-29567146-rkt8v"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.375395 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgwcg\" (UniqueName: \"kubernetes.io/projected/9ac9282e-f66c-4b81-9145-e69a7924619a-kube-api-access-fgwcg\") pod \"auto-csr-approver-29567146-rkt8v\" (UID: \"9ac9282e-f66c-4b81-9145-e69a7924619a\") " pod="openshift-infra/auto-csr-approver-29567146-rkt8v"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.502066 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-rkt8v"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.941940 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a45b6fb6-a659-4a13-96d8-2f41634e3423" path="/var/lib/kubelet/pods/a45b6fb6-a659-4a13-96d8-2f41634e3423/volumes"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.942995 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad456b82-4bac-43be-b409-afcea1169a3a" path="/var/lib/kubelet/pods/ad456b82-4bac-43be-b409-afcea1169a3a/volumes"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.943592 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae48d528-5867-44e8-a58b-7a035fd1c0a7" path="/var/lib/kubelet/pods/ae48d528-5867-44e8-a58b-7a035fd1c0a7/volumes"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.944242 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8fd306c-b4b4-4f8a-812c-4434756c38dc" path="/var/lib/kubelet/pods/b8fd306c-b4b4-4f8a-812c-4434756c38dc/volumes"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.948420 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7fbac44-b743-4049-ae33-92d8fdb12dfd" path="/var/lib/kubelet/pods/e7fbac44-b743-4049-ae33-92d8fdb12dfd/volumes"
Mar 20 17:46:00 crc kubenswrapper[4803]: I0320 17:46:00.968348 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-rkt8v"]
Mar 20 17:46:01 crc kubenswrapper[4803]: I0320 17:46:01.025658 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-055e-account-create-update-qzqpx"]
Mar 20 17:46:01 crc kubenswrapper[4803]: I0320 17:46:01.038680 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-055e-account-create-update-qzqpx"]
Mar 20 17:46:01 crc kubenswrapper[4803]: I0320 17:46:01.391209 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-rkt8v" event={"ID":"9ac9282e-f66c-4b81-9145-e69a7924619a","Type":"ContainerStarted","Data":"b0b457bd0c0db6f6138efdfb5798689e984f9e783e49bb3372f250c6bfab2906"}
Mar 20 17:46:01 crc kubenswrapper[4803]: I0320 17:46:01.848411 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:46:01 crc kubenswrapper[4803]: E0320 17:46:01.848856 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:46:02 crc kubenswrapper[4803]: I0320 17:46:02.866947 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f75c75d4-5e6d-40b0-831a-4114a94c6c64" path="/var/lib/kubelet/pods/f75c75d4-5e6d-40b0-831a-4114a94c6c64/volumes"
Mar 20 17:46:03 crc kubenswrapper[4803]: I0320 17:46:03.419674 4803 generic.go:334] "Generic (PLEG): container finished" podID="9ac9282e-f66c-4b81-9145-e69a7924619a" containerID="8e43ee3e3a82158905f72270aca231bf716c96761bb7c284429872842e0eec29" exitCode=0
Mar 20 17:46:03 crc kubenswrapper[4803]: I0320 17:46:03.419794 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-rkt8v" event={"ID":"9ac9282e-f66c-4b81-9145-e69a7924619a","Type":"ContainerDied","Data":"8e43ee3e3a82158905f72270aca231bf716c96761bb7c284429872842e0eec29"}
Mar 20 17:46:04 crc kubenswrapper[4803]: I0320 17:46:04.722383 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-rkt8v"
Mar 20 17:46:04 crc kubenswrapper[4803]: I0320 17:46:04.760219 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgwcg\" (UniqueName: \"kubernetes.io/projected/9ac9282e-f66c-4b81-9145-e69a7924619a-kube-api-access-fgwcg\") pod \"9ac9282e-f66c-4b81-9145-e69a7924619a\" (UID: \"9ac9282e-f66c-4b81-9145-e69a7924619a\") "
Mar 20 17:46:04 crc kubenswrapper[4803]: I0320 17:46:04.773014 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac9282e-f66c-4b81-9145-e69a7924619a-kube-api-access-fgwcg" (OuterVolumeSpecName: "kube-api-access-fgwcg") pod "9ac9282e-f66c-4b81-9145-e69a7924619a" (UID: "9ac9282e-f66c-4b81-9145-e69a7924619a"). InnerVolumeSpecName "kube-api-access-fgwcg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:46:04 crc kubenswrapper[4803]: I0320 17:46:04.863410 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgwcg\" (UniqueName: \"kubernetes.io/projected/9ac9282e-f66c-4b81-9145-e69a7924619a-kube-api-access-fgwcg\") on node \"crc\" DevicePath \"\""
Mar 20 17:46:05 crc kubenswrapper[4803]: I0320 17:46:05.440997 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567146-rkt8v" event={"ID":"9ac9282e-f66c-4b81-9145-e69a7924619a","Type":"ContainerDied","Data":"b0b457bd0c0db6f6138efdfb5798689e984f9e783e49bb3372f250c6bfab2906"}
Mar 20 17:46:05 crc kubenswrapper[4803]: I0320 17:46:05.441058 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b457bd0c0db6f6138efdfb5798689e984f9e783e49bb3372f250c6bfab2906"
Mar 20 17:46:05 crc kubenswrapper[4803]: I0320 17:46:05.441421 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567146-rkt8v"
Mar 20 17:46:05 crc kubenswrapper[4803]: I0320 17:46:05.805948 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-9xqw6"]
Mar 20 17:46:05 crc kubenswrapper[4803]: I0320 17:46:05.817908 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567140-9xqw6"]
Mar 20 17:46:06 crc kubenswrapper[4803]: I0320 17:46:06.859065 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8285ef9f-caaf-4e35-a37f-1dbe5914e739" path="/var/lib/kubelet/pods/8285ef9f-caaf-4e35-a37f-1dbe5914e739/volumes"
Mar 20 17:46:11 crc kubenswrapper[4803]: I0320 17:46:11.528688 4803 generic.go:334] "Generic (PLEG): container finished" podID="de5b2da1-3a72-4b62-98a5-71352eb71c90" containerID="5a976a793739301c6ab5cb2f8e14e83e3febb20a92c0a31b7374bb7a952cab8e" exitCode=0
Mar 20 17:46:11 crc kubenswrapper[4803]: I0320 17:46:11.528794 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7" event={"ID":"de5b2da1-3a72-4b62-98a5-71352eb71c90","Type":"ContainerDied","Data":"5a976a793739301c6ab5cb2f8e14e83e3febb20a92c0a31b7374bb7a952cab8e"}
Mar 20 17:46:12 crc kubenswrapper[4803]: I0320 17:46:12.849424 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a"
Mar 20 17:46:12 crc kubenswrapper[4803]: E0320 17:46:12.849760 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.052253 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.245270 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brzhn\" (UniqueName: \"kubernetes.io/projected/de5b2da1-3a72-4b62-98a5-71352eb71c90-kube-api-access-brzhn\") pod \"de5b2da1-3a72-4b62-98a5-71352eb71c90\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") "
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.245566 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-ssh-key-openstack-edpm-ipam\") pod \"de5b2da1-3a72-4b62-98a5-71352eb71c90\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") "
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.245657 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-inventory\") pod \"de5b2da1-3a72-4b62-98a5-71352eb71c90\" (UID: \"de5b2da1-3a72-4b62-98a5-71352eb71c90\") "
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.252121 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5b2da1-3a72-4b62-98a5-71352eb71c90-kube-api-access-brzhn" (OuterVolumeSpecName: "kube-api-access-brzhn") pod "de5b2da1-3a72-4b62-98a5-71352eb71c90" (UID: "de5b2da1-3a72-4b62-98a5-71352eb71c90"). InnerVolumeSpecName "kube-api-access-brzhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.280492 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-inventory" (OuterVolumeSpecName: "inventory") pod "de5b2da1-3a72-4b62-98a5-71352eb71c90" (UID: "de5b2da1-3a72-4b62-98a5-71352eb71c90"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.298139 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de5b2da1-3a72-4b62-98a5-71352eb71c90" (UID: "de5b2da1-3a72-4b62-98a5-71352eb71c90"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.348170 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.348236 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de5b2da1-3a72-4b62-98a5-71352eb71c90-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.348262 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brzhn\" (UniqueName: \"kubernetes.io/projected/de5b2da1-3a72-4b62-98a5-71352eb71c90-kube-api-access-brzhn\") on node \"crc\" DevicePath \"\""
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.555416 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7" event={"ID":"de5b2da1-3a72-4b62-98a5-71352eb71c90","Type":"ContainerDied","Data":"b80ad36e55412fe39e9485734c06231e2d63ca0edeba6069ae2c622b13848d10"}
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.555458 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80ad36e55412fe39e9485734c06231e2d63ca0edeba6069ae2c622b13848d10"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.555609 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-26xq7"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.722581 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"]
Mar 20 17:46:13 crc kubenswrapper[4803]: E0320 17:46:13.722993 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5b2da1-3a72-4b62-98a5-71352eb71c90" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.723010 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="de5b2da1-3a72-4b62-98a5-71352eb71c90" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:46:13 crc kubenswrapper[4803]: E0320 17:46:13.723023 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac9282e-f66c-4b81-9145-e69a7924619a" containerName="oc"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.723029 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac9282e-f66c-4b81-9145-e69a7924619a" containerName="oc"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.723222 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac9282e-f66c-4b81-9145-e69a7924619a" containerName="oc"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.723248 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5b2da1-3a72-4b62-98a5-71352eb71c90" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.723880 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.725886 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.726037 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.727246 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.727284 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.732663 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"]
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.858567 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.859041 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.859100 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt47f\" (UniqueName: \"kubernetes.io/projected/7bb67428-717c-47fc-9ab1-b94a5c502298-kube-api-access-pt47f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.960616 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.960874 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.960937 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt47f\" (UniqueName: \"kubernetes.io/projected/7bb67428-717c-47fc-9ab1-b94a5c502298-kube-api-access-pt47f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.964060 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.964874 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:13 crc kubenswrapper[4803]: I0320 17:46:13.979712 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt47f\" (UniqueName: \"kubernetes.io/projected/7bb67428-717c-47fc-9ab1-b94a5c502298-kube-api-access-pt47f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:14 crc kubenswrapper[4803]: I0320 17:46:14.041346 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"
Mar 20 17:46:14 crc kubenswrapper[4803]: I0320 17:46:14.602024 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh"]
Mar 20 17:46:15 crc kubenswrapper[4803]: I0320 17:46:15.573369 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh" event={"ID":"7bb67428-717c-47fc-9ab1-b94a5c502298","Type":"ContainerStarted","Data":"e583b96b4d80f2f27ffc70e355a88820aae244901297e1629804f604a46eac31"}
Mar 20 17:46:15 crc kubenswrapper[4803]: I0320 17:46:15.573944 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh" event={"ID":"7bb67428-717c-47fc-9ab1-b94a5c502298","Type":"ContainerStarted","Data":"f46cebf01621071d6b9723dd5c44c28d24bcd929ba457a5602d82c4e10a9e7cc"}
Mar 20 17:46:15 crc kubenswrapper[4803]: I0320 17:46:15.603247 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh" podStartSLOduration=2.136885342 podStartE2EDuration="2.603228089s" podCreationTimestamp="2026-03-20 17:46:13 +0000 UTC" firstStartedPulling="2026-03-20 17:46:14.606386103 +0000 UTC m=+1784.517978173" lastFinishedPulling="2026-03-20 17:46:15.07272884 +0000 UTC m=+1784.984320920" observedRunningTime="2026-03-20 17:46:15.589819146 +0000 UTC m=+1785.501411216" watchObservedRunningTime="2026-03-20 17:46:15.603228089 +0000 UTC m=+1785.514820149"
Mar 20 17:46:22 crc kubenswrapper[4803]: I0320 17:46:22.964379 4803 scope.go:117] "RemoveContainer" containerID="8c57888648f8a3e10ca03931c4b38c398f05972b9e25aa40b62bd813a26a6b03"
Mar 20 17:46:23 crc kubenswrapper[4803]: I0320 17:46:23.008957 4803 scope.go:117] "RemoveContainer" containerID="65dca507c9ef24dc14925fd13203843ca6d0e038d28c89b10b63ce182f014c11"
Mar 20 17:46:23 crc 
kubenswrapper[4803]: I0320 17:46:23.085032 4803 scope.go:117] "RemoveContainer" containerID="4d7d09ed102210b78de59f469432ae289aeec5293a29bb3cc365a921ee93281f" Mar 20 17:46:23 crc kubenswrapper[4803]: I0320 17:46:23.110992 4803 scope.go:117] "RemoveContainer" containerID="dc02b17a721a0e87422aea593e3725e0e205b4241ffe30bf4af60d3195f1f298" Mar 20 17:46:23 crc kubenswrapper[4803]: I0320 17:46:23.165018 4803 scope.go:117] "RemoveContainer" containerID="93eac6cd2687b29e56cb8aa14dd3ff285c9adbd212834dd76ce8b20dddf1148d" Mar 20 17:46:23 crc kubenswrapper[4803]: I0320 17:46:23.207359 4803 scope.go:117] "RemoveContainer" containerID="34f20c260996c5a08bf4fdfa8d1eb8d690278115a62dfd517d0b01139f2496d6" Mar 20 17:46:23 crc kubenswrapper[4803]: I0320 17:46:23.250982 4803 scope.go:117] "RemoveContainer" containerID="bcc0e9cab83123e1afbbb52dd23576fd843d14029b93754d0938f212da70f45e" Mar 20 17:46:24 crc kubenswrapper[4803]: I0320 17:46:24.848213 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:46:24 crc kubenswrapper[4803]: E0320 17:46:24.848603 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:46:34 crc kubenswrapper[4803]: I0320 17:46:34.091083 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jd6h"] Mar 20 17:46:34 crc kubenswrapper[4803]: I0320 17:46:34.098802 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4jd6h"] Mar 20 17:46:34 crc kubenswrapper[4803]: I0320 17:46:34.857899 4803 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="c9eefa09-fd26-4111-a6e7-605629e9192c" path="/var/lib/kubelet/pods/c9eefa09-fd26-4111-a6e7-605629e9192c/volumes" Mar 20 17:46:36 crc kubenswrapper[4803]: I0320 17:46:36.848113 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:46:36 crc kubenswrapper[4803]: E0320 17:46:36.848709 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:46:49 crc kubenswrapper[4803]: I0320 17:46:49.849323 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:46:49 crc kubenswrapper[4803]: E0320 17:46:49.850635 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:46:50 crc kubenswrapper[4803]: I0320 17:46:50.032697 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vvprx"] Mar 20 17:46:50 crc kubenswrapper[4803]: I0320 17:46:50.043374 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vvprx"] Mar 20 17:46:50 crc kubenswrapper[4803]: I0320 17:46:50.859791 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d2e08fd-0246-49d5-b874-93ba4a2915b6" 
path="/var/lib/kubelet/pods/6d2e08fd-0246-49d5-b874-93ba4a2915b6/volumes" Mar 20 17:46:52 crc kubenswrapper[4803]: I0320 17:46:52.054203 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkxj8"] Mar 20 17:46:52 crc kubenswrapper[4803]: I0320 17:46:52.064241 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-mkxj8"] Mar 20 17:46:52 crc kubenswrapper[4803]: I0320 17:46:52.859342 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51db7102-cc0d-407f-8d55-2a96de3f6f63" path="/var/lib/kubelet/pods/51db7102-cc0d-407f-8d55-2a96de3f6f63/volumes" Mar 20 17:47:00 crc kubenswrapper[4803]: I0320 17:47:00.857049 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:47:00 crc kubenswrapper[4803]: E0320 17:47:00.859314 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:47:01 crc kubenswrapper[4803]: I0320 17:47:01.005670 4803 generic.go:334] "Generic (PLEG): container finished" podID="7bb67428-717c-47fc-9ab1-b94a5c502298" containerID="e583b96b4d80f2f27ffc70e355a88820aae244901297e1629804f604a46eac31" exitCode=0 Mar 20 17:47:01 crc kubenswrapper[4803]: I0320 17:47:01.005774 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh" event={"ID":"7bb67428-717c-47fc-9ab1-b94a5c502298","Type":"ContainerDied","Data":"e583b96b4d80f2f27ffc70e355a88820aae244901297e1629804f604a46eac31"} Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.406858 4803 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh" Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.470428 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-inventory\") pod \"7bb67428-717c-47fc-9ab1-b94a5c502298\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.470504 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt47f\" (UniqueName: \"kubernetes.io/projected/7bb67428-717c-47fc-9ab1-b94a5c502298-kube-api-access-pt47f\") pod \"7bb67428-717c-47fc-9ab1-b94a5c502298\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.471346 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-ssh-key-openstack-edpm-ipam\") pod \"7bb67428-717c-47fc-9ab1-b94a5c502298\" (UID: \"7bb67428-717c-47fc-9ab1-b94a5c502298\") " Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.476132 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb67428-717c-47fc-9ab1-b94a5c502298-kube-api-access-pt47f" (OuterVolumeSpecName: "kube-api-access-pt47f") pod "7bb67428-717c-47fc-9ab1-b94a5c502298" (UID: "7bb67428-717c-47fc-9ab1-b94a5c502298"). InnerVolumeSpecName "kube-api-access-pt47f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.499984 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7bb67428-717c-47fc-9ab1-b94a5c502298" (UID: "7bb67428-717c-47fc-9ab1-b94a5c502298"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.500861 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-inventory" (OuterVolumeSpecName: "inventory") pod "7bb67428-717c-47fc-9ab1-b94a5c502298" (UID: "7bb67428-717c-47fc-9ab1-b94a5c502298"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.574574 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.575035 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bb67428-717c-47fc-9ab1-b94a5c502298-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:02 crc kubenswrapper[4803]: I0320 17:47:02.575189 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt47f\" (UniqueName: \"kubernetes.io/projected/7bb67428-717c-47fc-9ab1-b94a5c502298-kube-api-access-pt47f\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.022567 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh" 
event={"ID":"7bb67428-717c-47fc-9ab1-b94a5c502298","Type":"ContainerDied","Data":"f46cebf01621071d6b9723dd5c44c28d24bcd929ba457a5602d82c4e10a9e7cc"} Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.022975 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f46cebf01621071d6b9723dd5c44c28d24bcd929ba457a5602d82c4e10a9e7cc" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.022607 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.116716 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-54x78"] Mar 20 17:47:03 crc kubenswrapper[4803]: E0320 17:47:03.117295 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb67428-717c-47fc-9ab1-b94a5c502298" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.117365 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb67428-717c-47fc-9ab1-b94a5c502298" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.117612 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb67428-717c-47fc-9ab1-b94a5c502298" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.118245 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.120594 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.120835 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.121727 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.125645 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.128042 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-54x78"] Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.186149 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.186217 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m46dl\" (UniqueName: \"kubernetes.io/projected/8a143d0c-3c8c-4426-8b98-1309a281aaf8-kube-api-access-m46dl\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.186308 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.287913 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.288118 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.288168 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m46dl\" (UniqueName: \"kubernetes.io/projected/8a143d0c-3c8c-4426-8b98-1309a281aaf8-kube-api-access-m46dl\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.293736 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: 
I0320 17:47:03.296197 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.308754 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m46dl\" (UniqueName: \"kubernetes.io/projected/8a143d0c-3c8c-4426-8b98-1309a281aaf8-kube-api-access-m46dl\") pod \"ssh-known-hosts-edpm-deployment-54x78\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.437089 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:03 crc kubenswrapper[4803]: I0320 17:47:03.964815 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-54x78"] Mar 20 17:47:03 crc kubenswrapper[4803]: W0320 17:47:03.971385 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a143d0c_3c8c_4426_8b98_1309a281aaf8.slice/crio-2bd535eb5e6c14e9696cf5da683c0fc757564c0f246fd5440831c8bf1cfc4768 WatchSource:0}: Error finding container 2bd535eb5e6c14e9696cf5da683c0fc757564c0f246fd5440831c8bf1cfc4768: Status 404 returned error can't find the container with id 2bd535eb5e6c14e9696cf5da683c0fc757564c0f246fd5440831c8bf1cfc4768 Mar 20 17:47:04 crc kubenswrapper[4803]: I0320 17:47:04.036739 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" event={"ID":"8a143d0c-3c8c-4426-8b98-1309a281aaf8","Type":"ContainerStarted","Data":"2bd535eb5e6c14e9696cf5da683c0fc757564c0f246fd5440831c8bf1cfc4768"} Mar 20 
17:47:05 crc kubenswrapper[4803]: I0320 17:47:05.059541 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" event={"ID":"8a143d0c-3c8c-4426-8b98-1309a281aaf8","Type":"ContainerStarted","Data":"0300f148309190c141ffd5172e6247b7fbe034c670c1b5a5cfb3a7b2c29e92a8"} Mar 20 17:47:05 crc kubenswrapper[4803]: I0320 17:47:05.089200 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" podStartSLOduration=1.576665614 podStartE2EDuration="2.089157629s" podCreationTimestamp="2026-03-20 17:47:03 +0000 UTC" firstStartedPulling="2026-03-20 17:47:03.974201347 +0000 UTC m=+1833.885793417" lastFinishedPulling="2026-03-20 17:47:04.486693362 +0000 UTC m=+1834.398285432" observedRunningTime="2026-03-20 17:47:05.082512819 +0000 UTC m=+1834.994104939" watchObservedRunningTime="2026-03-20 17:47:05.089157629 +0000 UTC m=+1835.000749709" Mar 20 17:47:12 crc kubenswrapper[4803]: I0320 17:47:12.137825 4803 generic.go:334] "Generic (PLEG): container finished" podID="8a143d0c-3c8c-4426-8b98-1309a281aaf8" containerID="0300f148309190c141ffd5172e6247b7fbe034c670c1b5a5cfb3a7b2c29e92a8" exitCode=0 Mar 20 17:47:12 crc kubenswrapper[4803]: I0320 17:47:12.137954 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" event={"ID":"8a143d0c-3c8c-4426-8b98-1309a281aaf8","Type":"ContainerDied","Data":"0300f148309190c141ffd5172e6247b7fbe034c670c1b5a5cfb3a7b2c29e92a8"} Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.576827 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.692489 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-inventory-0\") pod \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.692619 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-ssh-key-openstack-edpm-ipam\") pod \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.692642 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m46dl\" (UniqueName: \"kubernetes.io/projected/8a143d0c-3c8c-4426-8b98-1309a281aaf8-kube-api-access-m46dl\") pod \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\" (UID: \"8a143d0c-3c8c-4426-8b98-1309a281aaf8\") " Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.698770 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a143d0c-3c8c-4426-8b98-1309a281aaf8-kube-api-access-m46dl" (OuterVolumeSpecName: "kube-api-access-m46dl") pod "8a143d0c-3c8c-4426-8b98-1309a281aaf8" (UID: "8a143d0c-3c8c-4426-8b98-1309a281aaf8"). InnerVolumeSpecName "kube-api-access-m46dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.719867 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8a143d0c-3c8c-4426-8b98-1309a281aaf8" (UID: "8a143d0c-3c8c-4426-8b98-1309a281aaf8"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.728869 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8a143d0c-3c8c-4426-8b98-1309a281aaf8" (UID: "8a143d0c-3c8c-4426-8b98-1309a281aaf8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.796246 4803 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.796309 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m46dl\" (UniqueName: \"kubernetes.io/projected/8a143d0c-3c8c-4426-8b98-1309a281aaf8-kube-api-access-m46dl\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:13 crc kubenswrapper[4803]: I0320 17:47:13.796327 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8a143d0c-3c8c-4426-8b98-1309a281aaf8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.168431 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" event={"ID":"8a143d0c-3c8c-4426-8b98-1309a281aaf8","Type":"ContainerDied","Data":"2bd535eb5e6c14e9696cf5da683c0fc757564c0f246fd5440831c8bf1cfc4768"} Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.168474 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bd535eb5e6c14e9696cf5da683c0fc757564c0f246fd5440831c8bf1cfc4768" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.168568 
4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-54x78" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.264387 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h"] Mar 20 17:47:14 crc kubenswrapper[4803]: E0320 17:47:14.264905 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a143d0c-3c8c-4426-8b98-1309a281aaf8" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.264927 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a143d0c-3c8c-4426-8b98-1309a281aaf8" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.265219 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a143d0c-3c8c-4426-8b98-1309a281aaf8" containerName="ssh-known-hosts-edpm-deployment" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.265993 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.271359 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h"] Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.294096 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.294465 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.294771 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.295262 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.417089 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.417290 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpptw\" (UniqueName: \"kubernetes.io/projected/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-kube-api-access-xpptw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.417466 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.519100 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.519582 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpptw\" (UniqueName: \"kubernetes.io/projected/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-kube-api-access-xpptw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.519653 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.525410 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: 
\"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.526072 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.551405 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpptw\" (UniqueName: \"kubernetes.io/projected/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-kube-api-access-xpptw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gfh5h\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.604492 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:14 crc kubenswrapper[4803]: I0320 17:47:14.967478 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h"] Mar 20 17:47:15 crc kubenswrapper[4803]: I0320 17:47:15.180127 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" event={"ID":"2aed2dd6-8792-412c-972f-6e7e4e4bae0e","Type":"ContainerStarted","Data":"cc2be75a04256c9e3fcd784b4d585417b4c27ceb2403386bc20068c42be52e20"} Mar 20 17:47:15 crc kubenswrapper[4803]: I0320 17:47:15.848493 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:47:15 crc kubenswrapper[4803]: E0320 17:47:15.849239 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:47:16 crc kubenswrapper[4803]: I0320 17:47:16.189860 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" event={"ID":"2aed2dd6-8792-412c-972f-6e7e4e4bae0e","Type":"ContainerStarted","Data":"06a25da7a992afa03a4b139382b7e2289eaed02902591110883425f921d92b97"} Mar 20 17:47:16 crc kubenswrapper[4803]: I0320 17:47:16.215439 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" podStartSLOduration=1.777103918 podStartE2EDuration="2.215415374s" podCreationTimestamp="2026-03-20 17:47:14 +0000 UTC" firstStartedPulling="2026-03-20 17:47:14.96916926 +0000 UTC 
m=+1844.880761330" lastFinishedPulling="2026-03-20 17:47:15.407480676 +0000 UTC m=+1845.319072786" observedRunningTime="2026-03-20 17:47:16.211536103 +0000 UTC m=+1846.123128193" watchObservedRunningTime="2026-03-20 17:47:16.215415374 +0000 UTC m=+1846.127007464" Mar 20 17:47:23 crc kubenswrapper[4803]: I0320 17:47:23.411025 4803 scope.go:117] "RemoveContainer" containerID="90e5c3c3e8794fc7317ca12d7bca72a137376bf9defdbc91d127b2f67d471087" Mar 20 17:47:23 crc kubenswrapper[4803]: I0320 17:47:23.448451 4803 scope.go:117] "RemoveContainer" containerID="0c2492519b69d5e624e7e1bfcdac8a3488889d3241da936111a3347141004dd1" Mar 20 17:47:23 crc kubenswrapper[4803]: I0320 17:47:23.523744 4803 scope.go:117] "RemoveContainer" containerID="a62ea244ef295ac92f292c6d911909f44bc278ede51fdc5861157b3b71b73e41" Mar 20 17:47:24 crc kubenswrapper[4803]: I0320 17:47:24.273328 4803 generic.go:334] "Generic (PLEG): container finished" podID="2aed2dd6-8792-412c-972f-6e7e4e4bae0e" containerID="06a25da7a992afa03a4b139382b7e2289eaed02902591110883425f921d92b97" exitCode=0 Mar 20 17:47:24 crc kubenswrapper[4803]: I0320 17:47:24.273369 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" event={"ID":"2aed2dd6-8792-412c-972f-6e7e4e4bae0e","Type":"ContainerDied","Data":"06a25da7a992afa03a4b139382b7e2289eaed02902591110883425f921d92b97"} Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.710439 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.837635 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpptw\" (UniqueName: \"kubernetes.io/projected/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-kube-api-access-xpptw\") pod \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.837715 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-inventory\") pod \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.837845 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-ssh-key-openstack-edpm-ipam\") pod \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\" (UID: \"2aed2dd6-8792-412c-972f-6e7e4e4bae0e\") " Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.847726 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-kube-api-access-xpptw" (OuterVolumeSpecName: "kube-api-access-xpptw") pod "2aed2dd6-8792-412c-972f-6e7e4e4bae0e" (UID: "2aed2dd6-8792-412c-972f-6e7e4e4bae0e"). InnerVolumeSpecName "kube-api-access-xpptw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.865142 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2aed2dd6-8792-412c-972f-6e7e4e4bae0e" (UID: "2aed2dd6-8792-412c-972f-6e7e4e4bae0e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.865633 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-inventory" (OuterVolumeSpecName: "inventory") pod "2aed2dd6-8792-412c-972f-6e7e4e4bae0e" (UID: "2aed2dd6-8792-412c-972f-6e7e4e4bae0e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.940863 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.940914 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpptw\" (UniqueName: \"kubernetes.io/projected/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-kube-api-access-xpptw\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:25 crc kubenswrapper[4803]: I0320 17:47:25.940938 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2aed2dd6-8792-412c-972f-6e7e4e4bae0e-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.293148 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" 
event={"ID":"2aed2dd6-8792-412c-972f-6e7e4e4bae0e","Type":"ContainerDied","Data":"cc2be75a04256c9e3fcd784b4d585417b4c27ceb2403386bc20068c42be52e20"} Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.293201 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gfh5h" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.293219 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc2be75a04256c9e3fcd784b4d585417b4c27ceb2403386bc20068c42be52e20" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.367553 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp"] Mar 20 17:47:26 crc kubenswrapper[4803]: E0320 17:47:26.368056 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aed2dd6-8792-412c-972f-6e7e4e4bae0e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.368083 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aed2dd6-8792-412c-972f-6e7e4e4bae0e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.368314 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aed2dd6-8792-412c-972f-6e7e4e4bae0e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.369073 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.370964 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.371211 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.372277 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.373096 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.394747 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp"] Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.450136 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95qvr\" (UniqueName: \"kubernetes.io/projected/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-kube-api-access-95qvr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.450218 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 
17:47:26.450269 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.552246 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.552407 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95qvr\" (UniqueName: \"kubernetes.io/projected/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-kube-api-access-95qvr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.552503 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.557131 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.566065 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.592423 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95qvr\" (UniqueName: \"kubernetes.io/projected/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-kube-api-access-95qvr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:26 crc kubenswrapper[4803]: I0320 17:47:26.713091 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:27 crc kubenswrapper[4803]: I0320 17:47:27.209736 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp"] Mar 20 17:47:27 crc kubenswrapper[4803]: I0320 17:47:27.305704 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" event={"ID":"6091d3e2-8164-4c5c-b5b9-9299dbe203d5","Type":"ContainerStarted","Data":"042da326b20de48c1853ed3a8e31eeea161dd2267ed94cbb739edcef340f08fe"} Mar 20 17:47:28 crc kubenswrapper[4803]: I0320 17:47:28.316541 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" event={"ID":"6091d3e2-8164-4c5c-b5b9-9299dbe203d5","Type":"ContainerStarted","Data":"f158dc7fda0e1e183bb3b77338e2c60a2e190c155caf58e66650bce2ac816c87"} Mar 20 17:47:28 crc kubenswrapper[4803]: I0320 17:47:28.344091 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" podStartSLOduration=1.742136612 podStartE2EDuration="2.344059652s" podCreationTimestamp="2026-03-20 17:47:26 +0000 UTC" firstStartedPulling="2026-03-20 17:47:27.211042296 +0000 UTC m=+1857.122634376" lastFinishedPulling="2026-03-20 17:47:27.812965346 +0000 UTC m=+1857.724557416" observedRunningTime="2026-03-20 17:47:28.329576969 +0000 UTC m=+1858.241169079" watchObservedRunningTime="2026-03-20 17:47:28.344059652 +0000 UTC m=+1858.255651762" Mar 20 17:47:28 crc kubenswrapper[4803]: I0320 17:47:28.847785 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:47:28 crc kubenswrapper[4803]: E0320 17:47:28.848081 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:47:36 crc kubenswrapper[4803]: I0320 17:47:36.051450 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5q8f"] Mar 20 17:47:36 crc kubenswrapper[4803]: I0320 17:47:36.060300 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-v5q8f"] Mar 20 17:47:36 crc kubenswrapper[4803]: I0320 17:47:36.863880 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ac9cf1-3fd1-49db-a10d-eedb31f9c82d" path="/var/lib/kubelet/pods/12ac9cf1-3fd1-49db-a10d-eedb31f9c82d/volumes" Mar 20 17:47:37 crc kubenswrapper[4803]: I0320 17:47:37.404410 4803 generic.go:334] "Generic (PLEG): container finished" podID="6091d3e2-8164-4c5c-b5b9-9299dbe203d5" containerID="f158dc7fda0e1e183bb3b77338e2c60a2e190c155caf58e66650bce2ac816c87" exitCode=0 Mar 20 17:47:37 crc kubenswrapper[4803]: I0320 17:47:37.404539 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" event={"ID":"6091d3e2-8164-4c5c-b5b9-9299dbe203d5","Type":"ContainerDied","Data":"f158dc7fda0e1e183bb3b77338e2c60a2e190c155caf58e66650bce2ac816c87"} Mar 20 17:47:38 crc kubenswrapper[4803]: I0320 17:47:38.898058 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:38 crc kubenswrapper[4803]: I0320 17:47:38.986852 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95qvr\" (UniqueName: \"kubernetes.io/projected/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-kube-api-access-95qvr\") pod \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " Mar 20 17:47:38 crc kubenswrapper[4803]: I0320 17:47:38.987003 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-ssh-key-openstack-edpm-ipam\") pod \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " Mar 20 17:47:38 crc kubenswrapper[4803]: I0320 17:47:38.987115 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-inventory\") pod \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\" (UID: \"6091d3e2-8164-4c5c-b5b9-9299dbe203d5\") " Mar 20 17:47:38 crc kubenswrapper[4803]: I0320 17:47:38.991724 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-kube-api-access-95qvr" (OuterVolumeSpecName: "kube-api-access-95qvr") pod "6091d3e2-8164-4c5c-b5b9-9299dbe203d5" (UID: "6091d3e2-8164-4c5c-b5b9-9299dbe203d5"). InnerVolumeSpecName "kube-api-access-95qvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.011740 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-inventory" (OuterVolumeSpecName: "inventory") pod "6091d3e2-8164-4c5c-b5b9-9299dbe203d5" (UID: "6091d3e2-8164-4c5c-b5b9-9299dbe203d5"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.011928 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6091d3e2-8164-4c5c-b5b9-9299dbe203d5" (UID: "6091d3e2-8164-4c5c-b5b9-9299dbe203d5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.090210 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.090247 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.090259 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95qvr\" (UniqueName: \"kubernetes.io/projected/6091d3e2-8164-4c5c-b5b9-9299dbe203d5-kube-api-access-95qvr\") on node \"crc\" DevicePath \"\"" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.426062 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" event={"ID":"6091d3e2-8164-4c5c-b5b9-9299dbe203d5","Type":"ContainerDied","Data":"042da326b20de48c1853ed3a8e31eeea161dd2267ed94cbb739edcef340f08fe"} Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.426118 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="042da326b20de48c1853ed3a8e31eeea161dd2267ed94cbb739edcef340f08fe" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 
17:47:39.426180 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.533359 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf"] Mar 20 17:47:39 crc kubenswrapper[4803]: E0320 17:47:39.534000 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6091d3e2-8164-4c5c-b5b9-9299dbe203d5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.534015 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6091d3e2-8164-4c5c-b5b9-9299dbe203d5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.534174 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6091d3e2-8164-4c5c-b5b9-9299dbe203d5" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.534776 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.539459 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.544844 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.544903 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.544978 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.545289 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.545306 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.545314 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.545623 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.566141 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf"] Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628363 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628408 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628438 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628618 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628688 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628721 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628779 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628885 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.628976 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.629026 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.629149 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.629214 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.629296 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.629416 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mk6x\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-kube-api-access-2mk6x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.731745 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732139 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732183 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-libvirt-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732225 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732280 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mk6x\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-kube-api-access-2mk6x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732411 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732453 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732500 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732599 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732656 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732694 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 
17:47:39.732764 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732838 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.732908 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.737396 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.738370 4803 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.739375 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.739563 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.739676 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.739863 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.741016 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.742180 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.742678 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.743667 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: 
\"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.743699 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.744319 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.744858 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.754135 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mk6x\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-kube-api-access-2mk6x\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:39 crc kubenswrapper[4803]: I0320 17:47:39.856331 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:47:40 crc kubenswrapper[4803]: I0320 17:47:40.357281 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf"] Mar 20 17:47:40 crc kubenswrapper[4803]: W0320 17:47:40.368732 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11eacc86_5400_4614_bfa1_353a4c9a4ef8.slice/crio-c2cb6535aca6954259d7abdf7323243933b74436ca9286135d8669c2f65c59a6 WatchSource:0}: Error finding container c2cb6535aca6954259d7abdf7323243933b74436ca9286135d8669c2f65c59a6: Status 404 returned error can't find the container with id c2cb6535aca6954259d7abdf7323243933b74436ca9286135d8669c2f65c59a6 Mar 20 17:47:40 crc kubenswrapper[4803]: I0320 17:47:40.434099 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" event={"ID":"11eacc86-5400-4614-bfa1-353a4c9a4ef8","Type":"ContainerStarted","Data":"c2cb6535aca6954259d7abdf7323243933b74436ca9286135d8669c2f65c59a6"} Mar 20 17:47:41 crc kubenswrapper[4803]: I0320 17:47:41.448686 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" event={"ID":"11eacc86-5400-4614-bfa1-353a4c9a4ef8","Type":"ContainerStarted","Data":"07738d151898bf4c3973c712b27ff6d495e689f6ceb584b299274f2c3f5d8eb7"} Mar 20 17:47:41 crc kubenswrapper[4803]: I0320 17:47:41.479382 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" podStartSLOduration=1.913465257 podStartE2EDuration="2.479364218s" 
podCreationTimestamp="2026-03-20 17:47:39 +0000 UTC" firstStartedPulling="2026-03-20 17:47:40.370659725 +0000 UTC m=+1870.282251815" lastFinishedPulling="2026-03-20 17:47:40.936558706 +0000 UTC m=+1870.848150776" observedRunningTime="2026-03-20 17:47:41.4735011 +0000 UTC m=+1871.385093180" watchObservedRunningTime="2026-03-20 17:47:41.479364218 +0000 UTC m=+1871.390956288" Mar 20 17:47:41 crc kubenswrapper[4803]: I0320 17:47:41.848763 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:47:42 crc kubenswrapper[4803]: I0320 17:47:42.460620 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"59c8765f1a44e70897205c8cc4ba9fd464249da755c0e1d0c4f525a2371cd37d"} Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.136140 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567148-lbscd"] Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.137992 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-lbscd" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.142809 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.143463 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.142834 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.150877 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-lbscd"] Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.261794 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7fp\" (UniqueName: \"kubernetes.io/projected/ac7e04c6-45d6-4f90-ba80-b9d56303fccb-kube-api-access-6z7fp\") pod \"auto-csr-approver-29567148-lbscd\" (UID: \"ac7e04c6-45d6-4f90-ba80-b9d56303fccb\") " pod="openshift-infra/auto-csr-approver-29567148-lbscd" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.364166 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7fp\" (UniqueName: \"kubernetes.io/projected/ac7e04c6-45d6-4f90-ba80-b9d56303fccb-kube-api-access-6z7fp\") pod \"auto-csr-approver-29567148-lbscd\" (UID: \"ac7e04c6-45d6-4f90-ba80-b9d56303fccb\") " pod="openshift-infra/auto-csr-approver-29567148-lbscd" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.382332 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7fp\" (UniqueName: \"kubernetes.io/projected/ac7e04c6-45d6-4f90-ba80-b9d56303fccb-kube-api-access-6z7fp\") pod \"auto-csr-approver-29567148-lbscd\" (UID: \"ac7e04c6-45d6-4f90-ba80-b9d56303fccb\") " 
pod="openshift-infra/auto-csr-approver-29567148-lbscd" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.464025 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-lbscd" Mar 20 17:48:00 crc kubenswrapper[4803]: I0320 17:48:00.930908 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-lbscd"] Mar 20 17:48:01 crc kubenswrapper[4803]: I0320 17:48:01.714321 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-lbscd" event={"ID":"ac7e04c6-45d6-4f90-ba80-b9d56303fccb","Type":"ContainerStarted","Data":"b5ddd68202b36650147ca5f4b0aa1521a2b114c1c255a80f4353f954d1603356"} Mar 20 17:48:02 crc kubenswrapper[4803]: I0320 17:48:02.722877 4803 generic.go:334] "Generic (PLEG): container finished" podID="ac7e04c6-45d6-4f90-ba80-b9d56303fccb" containerID="d599b5fc8384c0b39acfc0f027de3e6589dc2e228a077a9e4ada5edf8689b8d8" exitCode=0 Mar 20 17:48:02 crc kubenswrapper[4803]: I0320 17:48:02.722944 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-lbscd" event={"ID":"ac7e04c6-45d6-4f90-ba80-b9d56303fccb","Type":"ContainerDied","Data":"d599b5fc8384c0b39acfc0f027de3e6589dc2e228a077a9e4ada5edf8689b8d8"} Mar 20 17:48:04 crc kubenswrapper[4803]: I0320 17:48:04.054770 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-lbscd" Mar 20 17:48:04 crc kubenswrapper[4803]: I0320 17:48:04.142203 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z7fp\" (UniqueName: \"kubernetes.io/projected/ac7e04c6-45d6-4f90-ba80-b9d56303fccb-kube-api-access-6z7fp\") pod \"ac7e04c6-45d6-4f90-ba80-b9d56303fccb\" (UID: \"ac7e04c6-45d6-4f90-ba80-b9d56303fccb\") " Mar 20 17:48:04 crc kubenswrapper[4803]: I0320 17:48:04.147903 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac7e04c6-45d6-4f90-ba80-b9d56303fccb-kube-api-access-6z7fp" (OuterVolumeSpecName: "kube-api-access-6z7fp") pod "ac7e04c6-45d6-4f90-ba80-b9d56303fccb" (UID: "ac7e04c6-45d6-4f90-ba80-b9d56303fccb"). InnerVolumeSpecName "kube-api-access-6z7fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:04 crc kubenswrapper[4803]: I0320 17:48:04.244950 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z7fp\" (UniqueName: \"kubernetes.io/projected/ac7e04c6-45d6-4f90-ba80-b9d56303fccb-kube-api-access-6z7fp\") on node \"crc\" DevicePath \"\"" Mar 20 17:48:04 crc kubenswrapper[4803]: I0320 17:48:04.757237 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567148-lbscd" event={"ID":"ac7e04c6-45d6-4f90-ba80-b9d56303fccb","Type":"ContainerDied","Data":"b5ddd68202b36650147ca5f4b0aa1521a2b114c1c255a80f4353f954d1603356"} Mar 20 17:48:04 crc kubenswrapper[4803]: I0320 17:48:04.757288 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5ddd68202b36650147ca5f4b0aa1521a2b114c1c255a80f4353f954d1603356" Mar 20 17:48:04 crc kubenswrapper[4803]: I0320 17:48:04.757301 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567148-lbscd" Mar 20 17:48:05 crc kubenswrapper[4803]: I0320 17:48:05.125745 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-wh7rr"] Mar 20 17:48:05 crc kubenswrapper[4803]: I0320 17:48:05.133222 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567142-wh7rr"] Mar 20 17:48:06 crc kubenswrapper[4803]: I0320 17:48:06.858577 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7" path="/var/lib/kubelet/pods/b0f9fcbc-ce98-4e3d-89b8-6d209fc924e7/volumes" Mar 20 17:48:16 crc kubenswrapper[4803]: I0320 17:48:16.866576 4803 generic.go:334] "Generic (PLEG): container finished" podID="11eacc86-5400-4614-bfa1-353a4c9a4ef8" containerID="07738d151898bf4c3973c712b27ff6d495e689f6ceb584b299274f2c3f5d8eb7" exitCode=0 Mar 20 17:48:16 crc kubenswrapper[4803]: I0320 17:48:16.866659 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" event={"ID":"11eacc86-5400-4614-bfa1-353a4c9a4ef8","Type":"ContainerDied","Data":"07738d151898bf4c3973c712b27ff6d495e689f6ceb584b299274f2c3f5d8eb7"} Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.283504 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.327869 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328268 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ssh-key-openstack-edpm-ipam\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328326 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ovn-combined-ca-bundle\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328367 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mk6x\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-kube-api-access-2mk6x\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328481 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-nova-combined-ca-bundle\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 
20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328563 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-repo-setup-combined-ca-bundle\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328613 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328656 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-neutron-metadata-combined-ca-bundle\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328757 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-inventory\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328813 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-libvirt-combined-ca-bundle\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328849 4803 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328890 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-telemetry-combined-ca-bundle\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328936 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-bootstrap-combined-ca-bundle\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.328996 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\" (UID: \"11eacc86-5400-4614-bfa1-353a4c9a4ef8\") " Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.339094 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.339120 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.339159 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.339345 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-kube-api-access-2mk6x" (OuterVolumeSpecName: "kube-api-access-2mk6x") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "kube-api-access-2mk6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.339450 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.339485 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.340235 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.340801 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.341278 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.341420 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.341894 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.344893 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.363641 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-inventory" (OuterVolumeSpecName: "inventory") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.381271 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "11eacc86-5400-4614-bfa1-353a4c9a4ef8" (UID: "11eacc86-5400-4614-bfa1-353a4c9a4ef8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431673 4803 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431712 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mk6x\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-kube-api-access-2mk6x\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431737 4803 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431749 4803 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431766 4803 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431779 4803 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431793 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431803 4803 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431812 4803 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431821 4803 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431829 4803 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431837 4803 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431848 4803 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/11eacc86-5400-4614-bfa1-353a4c9a4ef8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.431861 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/11eacc86-5400-4614-bfa1-353a4c9a4ef8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.890987 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf" event={"ID":"11eacc86-5400-4614-bfa1-353a4c9a4ef8","Type":"ContainerDied","Data":"c2cb6535aca6954259d7abdf7323243933b74436ca9286135d8669c2f65c59a6"}
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.891037 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2cb6535aca6954259d7abdf7323243933b74436ca9286135d8669c2f65c59a6"
Mar 20 17:48:18 crc kubenswrapper[4803]: I0320 17:48:18.891436 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.013695 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"]
Mar 20 17:48:19 crc kubenswrapper[4803]: E0320 17:48:19.014208 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac7e04c6-45d6-4f90-ba80-b9d56303fccb" containerName="oc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.014234 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7e04c6-45d6-4f90-ba80-b9d56303fccb" containerName="oc"
Mar 20 17:48:19 crc kubenswrapper[4803]: E0320 17:48:19.014272 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11eacc86-5400-4614-bfa1-353a4c9a4ef8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.014286 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="11eacc86-5400-4614-bfa1-353a4c9a4ef8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.014644 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="11eacc86-5400-4614-bfa1-353a4c9a4ef8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.014676 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac7e04c6-45d6-4f90-ba80-b9d56303fccb" containerName="oc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.015653 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.018026 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.018336 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.018763 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.020935 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.021231 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.061444 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.061501 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.061575 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.061660 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.061696 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tx2p\" (UniqueName: \"kubernetes.io/projected/6bd23d21-aeac-4394-b83d-befbd825d5ce-kube-api-access-5tx2p\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.076112 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"]
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.163405 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.163513 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.163549 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tx2p\" (UniqueName: \"kubernetes.io/projected/6bd23d21-aeac-4394-b83d-befbd825d5ce-kube-api-access-5tx2p\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.163637 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.163664 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.164849 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.169594 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.170218 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.172974 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.183884 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tx2p\" (UniqueName: \"kubernetes.io/projected/6bd23d21-aeac-4394-b83d-befbd825d5ce-kube-api-access-5tx2p\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-knxqc\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.377616 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:48:19 crc kubenswrapper[4803]: I0320 17:48:19.941915 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"]
Mar 20 17:48:20 crc kubenswrapper[4803]: I0320 17:48:20.920262 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc" event={"ID":"6bd23d21-aeac-4394-b83d-befbd825d5ce","Type":"ContainerStarted","Data":"066a9e508f5dfa6c5ec5e53f25987d7686a489f5603cfd9548f27c2f915613e4"}
Mar 20 17:48:20 crc kubenswrapper[4803]: I0320 17:48:20.920614 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc" event={"ID":"6bd23d21-aeac-4394-b83d-befbd825d5ce","Type":"ContainerStarted","Data":"6df3bf91449a13aafd706f0ff26675b3531e6da446e748f83384f50aab3512cf"}
Mar 20 17:48:20 crc kubenswrapper[4803]: I0320 17:48:20.944080 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc" podStartSLOduration=2.52450078 podStartE2EDuration="2.9440542s" podCreationTimestamp="2026-03-20 17:48:18 +0000 UTC" firstStartedPulling="2026-03-20 17:48:19.950610482 +0000 UTC m=+1909.862202592" lastFinishedPulling="2026-03-20 17:48:20.370163932 +0000 UTC m=+1910.281756012" observedRunningTime="2026-03-20 17:48:20.933941741 +0000 UTC m=+1910.845533831" watchObservedRunningTime="2026-03-20 17:48:20.9440542 +0000 UTC m=+1910.855646320"
Mar 20 17:48:23 crc kubenswrapper[4803]: I0320 17:48:23.611431 4803 scope.go:117] "RemoveContainer" containerID="c7f1a559893fc902708f548cb281df3a3e3d7b05d13531135b05238e6580e6b9"
Mar 20 17:48:23 crc kubenswrapper[4803]: I0320 17:48:23.658604 4803 scope.go:117] "RemoveContainer" containerID="ccc886527813a854c8522a756637de08cff00151ac71b7666ffe61f174094326"
Mar 20 17:49:22 crc kubenswrapper[4803]: I0320 17:49:22.520844 4803 generic.go:334] "Generic (PLEG): container finished" podID="6bd23d21-aeac-4394-b83d-befbd825d5ce" containerID="066a9e508f5dfa6c5ec5e53f25987d7686a489f5603cfd9548f27c2f915613e4" exitCode=0
Mar 20 17:49:22 crc kubenswrapper[4803]: I0320 17:49:22.520942 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc" event={"ID":"6bd23d21-aeac-4394-b83d-befbd825d5ce","Type":"ContainerDied","Data":"066a9e508f5dfa6c5ec5e53f25987d7686a489f5603cfd9548f27c2f915613e4"}
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.005790 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.088878 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovn-combined-ca-bundle\") pod \"6bd23d21-aeac-4394-b83d-befbd825d5ce\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") "
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.089245 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovncontroller-config-0\") pod \"6bd23d21-aeac-4394-b83d-befbd825d5ce\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") "
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.089381 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ssh-key-openstack-edpm-ipam\") pod \"6bd23d21-aeac-4394-b83d-befbd825d5ce\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") "
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.089485 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tx2p\" (UniqueName: \"kubernetes.io/projected/6bd23d21-aeac-4394-b83d-befbd825d5ce-kube-api-access-5tx2p\") pod \"6bd23d21-aeac-4394-b83d-befbd825d5ce\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") "
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.089604 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-inventory\") pod \"6bd23d21-aeac-4394-b83d-befbd825d5ce\" (UID: \"6bd23d21-aeac-4394-b83d-befbd825d5ce\") "
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.094103 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd23d21-aeac-4394-b83d-befbd825d5ce-kube-api-access-5tx2p" (OuterVolumeSpecName: "kube-api-access-5tx2p") pod "6bd23d21-aeac-4394-b83d-befbd825d5ce" (UID: "6bd23d21-aeac-4394-b83d-befbd825d5ce"). InnerVolumeSpecName "kube-api-access-5tx2p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.094227 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6bd23d21-aeac-4394-b83d-befbd825d5ce" (UID: "6bd23d21-aeac-4394-b83d-befbd825d5ce"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.126768 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6bd23d21-aeac-4394-b83d-befbd825d5ce" (UID: "6bd23d21-aeac-4394-b83d-befbd825d5ce"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.130980 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-inventory" (OuterVolumeSpecName: "inventory") pod "6bd23d21-aeac-4394-b83d-befbd825d5ce" (UID: "6bd23d21-aeac-4394-b83d-befbd825d5ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.131295 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6bd23d21-aeac-4394-b83d-befbd825d5ce" (UID: "6bd23d21-aeac-4394-b83d-befbd825d5ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.191487 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.191730 4803 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.191803 4803 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6bd23d21-aeac-4394-b83d-befbd825d5ce-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.191869 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6bd23d21-aeac-4394-b83d-befbd825d5ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.191925 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tx2p\" (UniqueName: \"kubernetes.io/projected/6bd23d21-aeac-4394-b83d-befbd825d5ce-kube-api-access-5tx2p\") on node \"crc\" DevicePath \"\""
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.542830 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc" event={"ID":"6bd23d21-aeac-4394-b83d-befbd825d5ce","Type":"ContainerDied","Data":"6df3bf91449a13aafd706f0ff26675b3531e6da446e748f83384f50aab3512cf"}
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.542883 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6df3bf91449a13aafd706f0ff26675b3531e6da446e748f83384f50aab3512cf"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.542960 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-knxqc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.663416 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"]
Mar 20 17:49:24 crc kubenswrapper[4803]: E0320 17:49:24.665138 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd23d21-aeac-4394-b83d-befbd825d5ce" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.665468 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd23d21-aeac-4394-b83d-befbd825d5ce" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.665706 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd23d21-aeac-4394-b83d-befbd825d5ce" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.666480 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.670858 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.671396 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.671434 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.671750 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.671814 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.671922 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.682793 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"]
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.804092 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dpmk\" (UniqueName: \"kubernetes.io/projected/6aabf807-6d9b-4e6a-99db-ddfda9979b25-kube-api-access-2dpmk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.804269 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.804336 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.804513 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.804742 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.804863 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.906802 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.906868 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.906915 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dpmk\" (UniqueName: \"kubernetes.io/projected/6aabf807-6d9b-4e6a-99db-ddfda9979b25-kube-api-access-2dpmk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.906970 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.906995 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.907033 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.912400 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.918149 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.918323 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.918400 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.918839 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.925294 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dpmk\" (UniqueName: \"kubernetes.io/projected/6aabf807-6d9b-4e6a-99db-ddfda9979b25-kube-api-access-2dpmk\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:24 crc kubenswrapper[4803]: I0320 17:49:24.984764 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"
Mar 20 17:49:25 crc kubenswrapper[4803]: I0320 17:49:25.301885 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc"]
Mar 20 17:49:25 crc kubenswrapper[4803]: I0320 17:49:25.552572 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc" event={"ID":"6aabf807-6d9b-4e6a-99db-ddfda9979b25","Type":"ContainerStarted","Data":"902bb55df29262e31889139f4fb1eb9fe6a8acc939cc22b3fe5bb615630d792d"}
Mar 20 17:49:26 crc kubenswrapper[4803]: I0320 17:49:26.564210 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc" event={"ID":"6aabf807-6d9b-4e6a-99db-ddfda9979b25","Type":"ContainerStarted","Data":"7bba28bddcbe3b49c4e46a663c38fdd1864b4c5e4b84ec3357d2c3ceb22c5dc5"}
Mar 20 17:49:26 crc kubenswrapper[4803]: I0320 17:49:26.590302 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc" podStartSLOduration=2.109205206 podStartE2EDuration="2.590282347s" podCreationTimestamp="2026-03-20 17:49:24 +0000 UTC" firstStartedPulling="2026-03-20 17:49:25.307688391 +0000 UTC m=+1975.219280471" lastFinishedPulling="2026-03-20 17:49:25.788765542 +0000 UTC m=+1975.700357612" observedRunningTime="2026-03-20 17:49:26.585950973 +0000 UTC m=+1976.497543063" watchObservedRunningTime="2026-03-20 17:49:26.590282347 +0000 UTC m=+1976.501874437"
Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.144902 4803 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-infra/auto-csr-approver-29567150-b897s"] Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.146857 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-b897s" Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.149739 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.149861 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.150021 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.155196 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-b897s"] Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.338404 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgpv\" (UniqueName: \"kubernetes.io/projected/f3f88542-c200-48c2-a14e-ced4a4f7be3d-kube-api-access-7pgpv\") pod \"auto-csr-approver-29567150-b897s\" (UID: \"f3f88542-c200-48c2-a14e-ced4a4f7be3d\") " pod="openshift-infra/auto-csr-approver-29567150-b897s" Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.439914 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pgpv\" (UniqueName: \"kubernetes.io/projected/f3f88542-c200-48c2-a14e-ced4a4f7be3d-kube-api-access-7pgpv\") pod \"auto-csr-approver-29567150-b897s\" (UID: \"f3f88542-c200-48c2-a14e-ced4a4f7be3d\") " pod="openshift-infra/auto-csr-approver-29567150-b897s" Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.458176 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pgpv\" (UniqueName: 
\"kubernetes.io/projected/f3f88542-c200-48c2-a14e-ced4a4f7be3d-kube-api-access-7pgpv\") pod \"auto-csr-approver-29567150-b897s\" (UID: \"f3f88542-c200-48c2-a14e-ced4a4f7be3d\") " pod="openshift-infra/auto-csr-approver-29567150-b897s" Mar 20 17:50:00 crc kubenswrapper[4803]: I0320 17:50:00.661770 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-b897s" Mar 20 17:50:01 crc kubenswrapper[4803]: I0320 17:50:01.086741 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-b897s"] Mar 20 17:50:01 crc kubenswrapper[4803]: I0320 17:50:01.902586 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-b897s" event={"ID":"f3f88542-c200-48c2-a14e-ced4a4f7be3d","Type":"ContainerStarted","Data":"2c87ccaa854282be335e1023729cb945a919a88235d3ddf0518372454cc88e17"} Mar 20 17:50:06 crc kubenswrapper[4803]: I0320 17:50:06.244355 4803 generic.go:334] "Generic (PLEG): container finished" podID="f3f88542-c200-48c2-a14e-ced4a4f7be3d" containerID="07a5ad5600f6f2b247631623210160a6db220194a247797795dd807a2698b0d5" exitCode=0 Mar 20 17:50:06 crc kubenswrapper[4803]: I0320 17:50:06.244405 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-b897s" event={"ID":"f3f88542-c200-48c2-a14e-ced4a4f7be3d","Type":"ContainerDied","Data":"07a5ad5600f6f2b247631623210160a6db220194a247797795dd807a2698b0d5"} Mar 20 17:50:07 crc kubenswrapper[4803]: I0320 17:50:07.583744 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-b897s" Mar 20 17:50:07 crc kubenswrapper[4803]: I0320 17:50:07.709002 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pgpv\" (UniqueName: \"kubernetes.io/projected/f3f88542-c200-48c2-a14e-ced4a4f7be3d-kube-api-access-7pgpv\") pod \"f3f88542-c200-48c2-a14e-ced4a4f7be3d\" (UID: \"f3f88542-c200-48c2-a14e-ced4a4f7be3d\") " Mar 20 17:50:07 crc kubenswrapper[4803]: I0320 17:50:07.714408 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3f88542-c200-48c2-a14e-ced4a4f7be3d-kube-api-access-7pgpv" (OuterVolumeSpecName: "kube-api-access-7pgpv") pod "f3f88542-c200-48c2-a14e-ced4a4f7be3d" (UID: "f3f88542-c200-48c2-a14e-ced4a4f7be3d"). InnerVolumeSpecName "kube-api-access-7pgpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:50:07 crc kubenswrapper[4803]: I0320 17:50:07.812066 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pgpv\" (UniqueName: \"kubernetes.io/projected/f3f88542-c200-48c2-a14e-ced4a4f7be3d-kube-api-access-7pgpv\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 17:50:08.246325 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 17:50:08.246382 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 
17:50:08.265654 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567150-b897s" event={"ID":"f3f88542-c200-48c2-a14e-ced4a4f7be3d","Type":"ContainerDied","Data":"2c87ccaa854282be335e1023729cb945a919a88235d3ddf0518372454cc88e17"} Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 17:50:08.265871 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c87ccaa854282be335e1023729cb945a919a88235d3ddf0518372454cc88e17" Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 17:50:08.265737 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567150-b897s" Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 17:50:08.678110 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-26gb2"] Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 17:50:08.691022 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567144-26gb2"] Mar 20 17:50:08 crc kubenswrapper[4803]: I0320 17:50:08.858821 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f3ef92-1015-4a5c-aebe-556d2e7410f2" path="/var/lib/kubelet/pods/93f3ef92-1015-4a5c-aebe-556d2e7410f2/volumes" Mar 20 17:50:14 crc kubenswrapper[4803]: I0320 17:50:14.352010 4803 generic.go:334] "Generic (PLEG): container finished" podID="6aabf807-6d9b-4e6a-99db-ddfda9979b25" containerID="7bba28bddcbe3b49c4e46a663c38fdd1864b4c5e4b84ec3357d2c3ceb22c5dc5" exitCode=0 Mar 20 17:50:14 crc kubenswrapper[4803]: I0320 17:50:14.352105 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc" event={"ID":"6aabf807-6d9b-4e6a-99db-ddfda9979b25","Type":"ContainerDied","Data":"7bba28bddcbe3b49c4e46a663c38fdd1864b4c5e4b84ec3357d2c3ceb22c5dc5"} Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.778849 4803 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.880422 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-inventory\") pod \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.880481 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-nova-metadata-neutron-config-0\") pod \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.880520 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-ssh-key-openstack-edpm-ipam\") pod \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.880558 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dpmk\" (UniqueName: \"kubernetes.io/projected/6aabf807-6d9b-4e6a-99db-ddfda9979b25-kube-api-access-2dpmk\") pod \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.880693 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-metadata-combined-ca-bundle\") pod \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " Mar 20 17:50:15 crc 
kubenswrapper[4803]: I0320 17:50:15.880738 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-ovn-metadata-agent-neutron-config-0\") pod \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\" (UID: \"6aabf807-6d9b-4e6a-99db-ddfda9979b25\") " Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.887559 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aabf807-6d9b-4e6a-99db-ddfda9979b25-kube-api-access-2dpmk" (OuterVolumeSpecName: "kube-api-access-2dpmk") pod "6aabf807-6d9b-4e6a-99db-ddfda9979b25" (UID: "6aabf807-6d9b-4e6a-99db-ddfda9979b25"). InnerVolumeSpecName "kube-api-access-2dpmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.893176 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "6aabf807-6d9b-4e6a-99db-ddfda9979b25" (UID: "6aabf807-6d9b-4e6a-99db-ddfda9979b25"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.909663 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6aabf807-6d9b-4e6a-99db-ddfda9979b25" (UID: "6aabf807-6d9b-4e6a-99db-ddfda9979b25"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.913216 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-inventory" (OuterVolumeSpecName: "inventory") pod "6aabf807-6d9b-4e6a-99db-ddfda9979b25" (UID: "6aabf807-6d9b-4e6a-99db-ddfda9979b25"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.915236 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "6aabf807-6d9b-4e6a-99db-ddfda9979b25" (UID: "6aabf807-6d9b-4e6a-99db-ddfda9979b25"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.915256 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "6aabf807-6d9b-4e6a-99db-ddfda9979b25" (UID: "6aabf807-6d9b-4e6a-99db-ddfda9979b25"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.982888 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.982936 4803 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.982949 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.982962 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dpmk\" (UniqueName: \"kubernetes.io/projected/6aabf807-6d9b-4e6a-99db-ddfda9979b25-kube-api-access-2dpmk\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.982972 4803 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:15 crc kubenswrapper[4803]: I0320 17:50:15.982982 4803 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6aabf807-6d9b-4e6a-99db-ddfda9979b25-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.371025 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc" event={"ID":"6aabf807-6d9b-4e6a-99db-ddfda9979b25","Type":"ContainerDied","Data":"902bb55df29262e31889139f4fb1eb9fe6a8acc939cc22b3fe5bb615630d792d"} Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.371340 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="902bb55df29262e31889139f4fb1eb9fe6a8acc939cc22b3fe5bb615630d792d" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.371257 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.530632 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz"] Mar 20 17:50:16 crc kubenswrapper[4803]: E0320 17:50:16.531154 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3f88542-c200-48c2-a14e-ced4a4f7be3d" containerName="oc" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.531181 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3f88542-c200-48c2-a14e-ced4a4f7be3d" containerName="oc" Mar 20 17:50:16 crc kubenswrapper[4803]: E0320 17:50:16.531194 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aabf807-6d9b-4e6a-99db-ddfda9979b25" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.531204 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aabf807-6d9b-4e6a-99db-ddfda9979b25" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.531407 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3f88542-c200-48c2-a14e-ced4a4f7be3d" containerName="oc" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.531435 4803 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="6aabf807-6d9b-4e6a-99db-ddfda9979b25" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.532266 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.534108 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.534246 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.534396 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.535331 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.539351 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.551510 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz"] Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.595104 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.595309 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.595397 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.595501 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.595806 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf5xp\" (UniqueName: \"kubernetes.io/projected/f644be5e-0ec5-499a-a42d-b4381159e310-kube-api-access-jf5xp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.698126 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.698468 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.698667 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.698871 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf5xp\" (UniqueName: \"kubernetes.io/projected/f644be5e-0ec5-499a-a42d-b4381159e310-kube-api-access-jf5xp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.699058 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 
17:50:16.702882 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.703410 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.705213 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.713281 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.739822 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf5xp\" (UniqueName: \"kubernetes.io/projected/f644be5e-0ec5-499a-a42d-b4381159e310-kube-api-access-jf5xp\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz\" 
(UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:16 crc kubenswrapper[4803]: I0320 17:50:16.850039 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:50:17 crc kubenswrapper[4803]: I0320 17:50:17.403299 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz"] Mar 20 17:50:18 crc kubenswrapper[4803]: I0320 17:50:18.388845 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" event={"ID":"f644be5e-0ec5-499a-a42d-b4381159e310","Type":"ContainerStarted","Data":"499341b6f7e46ce05999a2072b978485372329f2f7d1951942e0fbc82ea24201"} Mar 20 17:50:19 crc kubenswrapper[4803]: I0320 17:50:19.398222 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" event={"ID":"f644be5e-0ec5-499a-a42d-b4381159e310","Type":"ContainerStarted","Data":"4a0aace26aeabddd9513d7ce39f7a49bad437429dceb0ddd981c96718727fc2e"} Mar 20 17:50:19 crc kubenswrapper[4803]: I0320 17:50:19.428716 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" podStartSLOduration=2.108435973 podStartE2EDuration="3.427724168s" podCreationTimestamp="2026-03-20 17:50:16 +0000 UTC" firstStartedPulling="2026-03-20 17:50:17.425940099 +0000 UTC m=+2027.337532169" lastFinishedPulling="2026-03-20 17:50:18.745228294 +0000 UTC m=+2028.656820364" observedRunningTime="2026-03-20 17:50:19.422283382 +0000 UTC m=+2029.333875452" watchObservedRunningTime="2026-03-20 17:50:19.427724168 +0000 UTC m=+2029.339316238" Mar 20 17:50:23 crc kubenswrapper[4803]: I0320 17:50:23.762030 4803 scope.go:117] "RemoveContainer" 
containerID="4e6109e9fb0b0c46c1018e8d06715e4135977bd626b674382406c4a04806d0cb" Mar 20 17:50:38 crc kubenswrapper[4803]: I0320 17:50:38.245920 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:50:38 crc kubenswrapper[4803]: I0320 17:50:38.246423 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.245855 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.246389 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.246439 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.247249 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"59c8765f1a44e70897205c8cc4ba9fd464249da755c0e1d0c4f525a2371cd37d"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.247303 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://59c8765f1a44e70897205c8cc4ba9fd464249da755c0e1d0c4f525a2371cd37d" gracePeriod=600 Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.846600 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="59c8765f1a44e70897205c8cc4ba9fd464249da755c0e1d0c4f525a2371cd37d" exitCode=0 Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.846677 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"59c8765f1a44e70897205c8cc4ba9fd464249da755c0e1d0c4f525a2371cd37d"} Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.847057 4803 scope.go:117] "RemoveContainer" containerID="de153388f66f8a2b032406cc7306392a9ac0e33c0e5da0341da7b5609f97160a" Mar 20 17:51:08 crc kubenswrapper[4803]: I0320 17:51:08.847793 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d"} Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.154754 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567152-nn7vb"] Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 
17:52:00.156717 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-nn7vb" Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.159405 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.159453 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.159506 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.178413 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-nn7vb"] Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.262699 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxvx\" (UniqueName: \"kubernetes.io/projected/372304a1-9ac8-40a9-95d3-2e3efbc79546-kube-api-access-kqxvx\") pod \"auto-csr-approver-29567152-nn7vb\" (UID: \"372304a1-9ac8-40a9-95d3-2e3efbc79546\") " pod="openshift-infra/auto-csr-approver-29567152-nn7vb" Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.365418 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxvx\" (UniqueName: \"kubernetes.io/projected/372304a1-9ac8-40a9-95d3-2e3efbc79546-kube-api-access-kqxvx\") pod \"auto-csr-approver-29567152-nn7vb\" (UID: \"372304a1-9ac8-40a9-95d3-2e3efbc79546\") " pod="openshift-infra/auto-csr-approver-29567152-nn7vb" Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.385812 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxvx\" (UniqueName: \"kubernetes.io/projected/372304a1-9ac8-40a9-95d3-2e3efbc79546-kube-api-access-kqxvx\") pod 
\"auto-csr-approver-29567152-nn7vb\" (UID: \"372304a1-9ac8-40a9-95d3-2e3efbc79546\") " pod="openshift-infra/auto-csr-approver-29567152-nn7vb" Mar 20 17:52:00 crc kubenswrapper[4803]: I0320 17:52:00.508222 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-nn7vb" Mar 20 17:52:01 crc kubenswrapper[4803]: I0320 17:52:01.037504 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-nn7vb"] Mar 20 17:52:01 crc kubenswrapper[4803]: I0320 17:52:01.050585 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:52:01 crc kubenswrapper[4803]: I0320 17:52:01.335351 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-nn7vb" event={"ID":"372304a1-9ac8-40a9-95d3-2e3efbc79546","Type":"ContainerStarted","Data":"7dff0b1db23120843b56a8aa06b8d00d711732b9294c1ffbea222cbc1e75b5c6"} Mar 20 17:52:03 crc kubenswrapper[4803]: I0320 17:52:03.351508 4803 generic.go:334] "Generic (PLEG): container finished" podID="372304a1-9ac8-40a9-95d3-2e3efbc79546" containerID="c7b87214f927b52ece5ab15b4913f0215be32ca881915c61f5cc4cb7a464d1e9" exitCode=0 Mar 20 17:52:03 crc kubenswrapper[4803]: I0320 17:52:03.351566 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-nn7vb" event={"ID":"372304a1-9ac8-40a9-95d3-2e3efbc79546","Type":"ContainerDied","Data":"c7b87214f927b52ece5ab15b4913f0215be32ca881915c61f5cc4cb7a464d1e9"} Mar 20 17:52:04 crc kubenswrapper[4803]: I0320 17:52:04.766515 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-nn7vb" Mar 20 17:52:04 crc kubenswrapper[4803]: I0320 17:52:04.866141 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqxvx\" (UniqueName: \"kubernetes.io/projected/372304a1-9ac8-40a9-95d3-2e3efbc79546-kube-api-access-kqxvx\") pod \"372304a1-9ac8-40a9-95d3-2e3efbc79546\" (UID: \"372304a1-9ac8-40a9-95d3-2e3efbc79546\") " Mar 20 17:52:04 crc kubenswrapper[4803]: I0320 17:52:04.872870 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/372304a1-9ac8-40a9-95d3-2e3efbc79546-kube-api-access-kqxvx" (OuterVolumeSpecName: "kube-api-access-kqxvx") pod "372304a1-9ac8-40a9-95d3-2e3efbc79546" (UID: "372304a1-9ac8-40a9-95d3-2e3efbc79546"). InnerVolumeSpecName "kube-api-access-kqxvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:04 crc kubenswrapper[4803]: I0320 17:52:04.969156 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqxvx\" (UniqueName: \"kubernetes.io/projected/372304a1-9ac8-40a9-95d3-2e3efbc79546-kube-api-access-kqxvx\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:05 crc kubenswrapper[4803]: I0320 17:52:05.379821 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567152-nn7vb" event={"ID":"372304a1-9ac8-40a9-95d3-2e3efbc79546","Type":"ContainerDied","Data":"7dff0b1db23120843b56a8aa06b8d00d711732b9294c1ffbea222cbc1e75b5c6"} Mar 20 17:52:05 crc kubenswrapper[4803]: I0320 17:52:05.379868 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dff0b1db23120843b56a8aa06b8d00d711732b9294c1ffbea222cbc1e75b5c6" Mar 20 17:52:05 crc kubenswrapper[4803]: I0320 17:52:05.379873 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567152-nn7vb" Mar 20 17:52:05 crc kubenswrapper[4803]: I0320 17:52:05.867445 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-rkt8v"] Mar 20 17:52:05 crc kubenswrapper[4803]: I0320 17:52:05.877713 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567146-rkt8v"] Mar 20 17:52:06 crc kubenswrapper[4803]: I0320 17:52:06.871291 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac9282e-f66c-4b81-9145-e69a7924619a" path="/var/lib/kubelet/pods/9ac9282e-f66c-4b81-9145-e69a7924619a/volumes" Mar 20 17:52:23 crc kubenswrapper[4803]: I0320 17:52:23.843564 4803 scope.go:117] "RemoveContainer" containerID="8e43ee3e3a82158905f72270aca231bf716c96761bb7c284429872842e0eec29" Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.884255 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5p9sj"] Mar 20 17:52:34 crc kubenswrapper[4803]: E0320 17:52:34.885439 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="372304a1-9ac8-40a9-95d3-2e3efbc79546" containerName="oc" Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.885461 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="372304a1-9ac8-40a9-95d3-2e3efbc79546" containerName="oc" Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.885838 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="372304a1-9ac8-40a9-95d3-2e3efbc79546" containerName="oc" Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.888136 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.897700 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5p9sj"] Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.996133 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-utilities\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.996420 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-catalog-content\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:34 crc kubenswrapper[4803]: I0320 17:52:34.996671 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgh6h\" (UniqueName: \"kubernetes.io/projected/f6af8689-f6bb-4382-89b2-57f4de219ce9-kube-api-access-lgh6h\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.098828 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-catalog-content\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.098969 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lgh6h\" (UniqueName: \"kubernetes.io/projected/f6af8689-f6bb-4382-89b2-57f4de219ce9-kube-api-access-lgh6h\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.099121 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-utilities\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.099352 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-catalog-content\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.099615 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-utilities\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.119341 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgh6h\" (UniqueName: \"kubernetes.io/projected/f6af8689-f6bb-4382-89b2-57f4de219ce9-kube-api-access-lgh6h\") pod \"community-operators-5p9sj\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.224679 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:35 crc kubenswrapper[4803]: I0320 17:52:35.801767 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5p9sj"] Mar 20 17:52:36 crc kubenswrapper[4803]: I0320 17:52:36.713749 4803 generic.go:334] "Generic (PLEG): container finished" podID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerID="ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101" exitCode=0 Mar 20 17:52:36 crc kubenswrapper[4803]: I0320 17:52:36.713813 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p9sj" event={"ID":"f6af8689-f6bb-4382-89b2-57f4de219ce9","Type":"ContainerDied","Data":"ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101"} Mar 20 17:52:36 crc kubenswrapper[4803]: I0320 17:52:36.713895 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p9sj" event={"ID":"f6af8689-f6bb-4382-89b2-57f4de219ce9","Type":"ContainerStarted","Data":"e363c5c1fcbe1457f93588c705e9211c56992f873171d7360e24df6dec444471"} Mar 20 17:52:37 crc kubenswrapper[4803]: I0320 17:52:37.725723 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p9sj" event={"ID":"f6af8689-f6bb-4382-89b2-57f4de219ce9","Type":"ContainerStarted","Data":"3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11"} Mar 20 17:52:38 crc kubenswrapper[4803]: I0320 17:52:38.738340 4803 generic.go:334] "Generic (PLEG): container finished" podID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerID="3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11" exitCode=0 Mar 20 17:52:38 crc kubenswrapper[4803]: I0320 17:52:38.738399 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p9sj" 
event={"ID":"f6af8689-f6bb-4382-89b2-57f4de219ce9","Type":"ContainerDied","Data":"3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11"} Mar 20 17:52:39 crc kubenswrapper[4803]: I0320 17:52:39.748820 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p9sj" event={"ID":"f6af8689-f6bb-4382-89b2-57f4de219ce9","Type":"ContainerStarted","Data":"2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099"} Mar 20 17:52:39 crc kubenswrapper[4803]: I0320 17:52:39.769190 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5p9sj" podStartSLOduration=3.346015687 podStartE2EDuration="5.769167633s" podCreationTimestamp="2026-03-20 17:52:34 +0000 UTC" firstStartedPulling="2026-03-20 17:52:36.716042783 +0000 UTC m=+2166.627634863" lastFinishedPulling="2026-03-20 17:52:39.139194709 +0000 UTC m=+2169.050786809" observedRunningTime="2026-03-20 17:52:39.763850462 +0000 UTC m=+2169.675442542" watchObservedRunningTime="2026-03-20 17:52:39.769167633 +0000 UTC m=+2169.680759703" Mar 20 17:52:45 crc kubenswrapper[4803]: I0320 17:52:45.225145 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:45 crc kubenswrapper[4803]: I0320 17:52:45.225779 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:45 crc kubenswrapper[4803]: I0320 17:52:45.288811 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:45 crc kubenswrapper[4803]: I0320 17:52:45.840604 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:45 crc kubenswrapper[4803]: I0320 17:52:45.887094 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-5p9sj"] Mar 20 17:52:47 crc kubenswrapper[4803]: I0320 17:52:47.814013 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5p9sj" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerName="registry-server" containerID="cri-o://2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099" gracePeriod=2 Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.265259 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.343337 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgh6h\" (UniqueName: \"kubernetes.io/projected/f6af8689-f6bb-4382-89b2-57f4de219ce9-kube-api-access-lgh6h\") pod \"f6af8689-f6bb-4382-89b2-57f4de219ce9\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.343806 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-utilities\") pod \"f6af8689-f6bb-4382-89b2-57f4de219ce9\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.343915 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-catalog-content\") pod \"f6af8689-f6bb-4382-89b2-57f4de219ce9\" (UID: \"f6af8689-f6bb-4382-89b2-57f4de219ce9\") " Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.345057 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-utilities" (OuterVolumeSpecName: "utilities") pod "f6af8689-f6bb-4382-89b2-57f4de219ce9" (UID: 
"f6af8689-f6bb-4382-89b2-57f4de219ce9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.354740 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6af8689-f6bb-4382-89b2-57f4de219ce9-kube-api-access-lgh6h" (OuterVolumeSpecName: "kube-api-access-lgh6h") pod "f6af8689-f6bb-4382-89b2-57f4de219ce9" (UID: "f6af8689-f6bb-4382-89b2-57f4de219ce9"). InnerVolumeSpecName "kube-api-access-lgh6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.390985 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6af8689-f6bb-4382-89b2-57f4de219ce9" (UID: "f6af8689-f6bb-4382-89b2-57f4de219ce9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.445573 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.445611 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgh6h\" (UniqueName: \"kubernetes.io/projected/f6af8689-f6bb-4382-89b2-57f4de219ce9-kube-api-access-lgh6h\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.445626 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6af8689-f6bb-4382-89b2-57f4de219ce9-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.825487 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerID="2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099" exitCode=0 Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.825565 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p9sj" event={"ID":"f6af8689-f6bb-4382-89b2-57f4de219ce9","Type":"ContainerDied","Data":"2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099"} Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.825590 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5p9sj" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.825602 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5p9sj" event={"ID":"f6af8689-f6bb-4382-89b2-57f4de219ce9","Type":"ContainerDied","Data":"e363c5c1fcbe1457f93588c705e9211c56992f873171d7360e24df6dec444471"} Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.825631 4803 scope.go:117] "RemoveContainer" containerID="2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.855632 4803 scope.go:117] "RemoveContainer" containerID="3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.872966 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5p9sj"] Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.881852 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5p9sj"] Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.899657 4803 scope.go:117] "RemoveContainer" containerID="ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.940254 4803 scope.go:117] "RemoveContainer" 
containerID="2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099" Mar 20 17:52:48 crc kubenswrapper[4803]: E0320 17:52:48.940691 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099\": container with ID starting with 2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099 not found: ID does not exist" containerID="2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.940740 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099"} err="failed to get container status \"2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099\": rpc error: code = NotFound desc = could not find container \"2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099\": container with ID starting with 2de4d3eaacd28f7c1513aaf08e912076cd4312ad368d3d6d71542ea82fde6099 not found: ID does not exist" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.940780 4803 scope.go:117] "RemoveContainer" containerID="3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11" Mar 20 17:52:48 crc kubenswrapper[4803]: E0320 17:52:48.941070 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11\": container with ID starting with 3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11 not found: ID does not exist" containerID="3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.941108 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11"} err="failed to get container status \"3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11\": rpc error: code = NotFound desc = could not find container \"3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11\": container with ID starting with 3b3149d250a5023ad50491e0bdd457f472fc0ed9a34df33386016da8cda3da11 not found: ID does not exist" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.941136 4803 scope.go:117] "RemoveContainer" containerID="ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101" Mar 20 17:52:48 crc kubenswrapper[4803]: E0320 17:52:48.941484 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101\": container with ID starting with ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101 not found: ID does not exist" containerID="ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101" Mar 20 17:52:48 crc kubenswrapper[4803]: I0320 17:52:48.941506 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101"} err="failed to get container status \"ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101\": rpc error: code = NotFound desc = could not find container \"ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101\": container with ID starting with ec54a2e186ed092e96027390ceb960cc835d6f9d27b4232d7ce32770f97dc101 not found: ID does not exist" Mar 20 17:52:50 crc kubenswrapper[4803]: I0320 17:52:50.864497 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" path="/var/lib/kubelet/pods/f6af8689-f6bb-4382-89b2-57f4de219ce9/volumes" Mar 20 17:53:08 crc kubenswrapper[4803]: I0320 
17:53:08.246028 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:53:08 crc kubenswrapper[4803]: I0320 17:53:08.246492 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:53:38 crc kubenswrapper[4803]: I0320 17:53:38.246168 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:53:38 crc kubenswrapper[4803]: I0320 17:53:38.246795 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.391775 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-szlvd"] Mar 20 17:53:53 crc kubenswrapper[4803]: E0320 17:53:53.393757 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerName="registry-server" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.393858 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" 
containerName="registry-server" Mar 20 17:53:53 crc kubenswrapper[4803]: E0320 17:53:53.393944 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerName="extract-content" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.394027 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerName="extract-content" Mar 20 17:53:53 crc kubenswrapper[4803]: E0320 17:53:53.394141 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerName="extract-utilities" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.394232 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerName="extract-utilities" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.394625 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6af8689-f6bb-4382-89b2-57f4de219ce9" containerName="registry-server" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.396296 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.405734 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szlvd"] Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.445094 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-utilities\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.445175 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-catalog-content\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.445256 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blq29\" (UniqueName: \"kubernetes.io/projected/028314ea-8df7-4b95-9ea2-2552481a74dc-kube-api-access-blq29\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.546412 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-utilities\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.546481 4803 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-catalog-content\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.546604 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blq29\" (UniqueName: \"kubernetes.io/projected/028314ea-8df7-4b95-9ea2-2552481a74dc-kube-api-access-blq29\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.547407 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-utilities\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.547475 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-catalog-content\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.569494 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blq29\" (UniqueName: \"kubernetes.io/projected/028314ea-8df7-4b95-9ea2-2552481a74dc-kube-api-access-blq29\") pod \"redhat-marketplace-szlvd\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:53 crc kubenswrapper[4803]: I0320 17:53:53.724486 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:53:54 crc kubenswrapper[4803]: I0320 17:53:54.186612 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-szlvd"] Mar 20 17:53:54 crc kubenswrapper[4803]: I0320 17:53:54.484516 4803 generic.go:334] "Generic (PLEG): container finished" podID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerID="6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81" exitCode=0 Mar 20 17:53:54 crc kubenswrapper[4803]: I0320 17:53:54.484609 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szlvd" event={"ID":"028314ea-8df7-4b95-9ea2-2552481a74dc","Type":"ContainerDied","Data":"6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81"} Mar 20 17:53:54 crc kubenswrapper[4803]: I0320 17:53:54.484860 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szlvd" event={"ID":"028314ea-8df7-4b95-9ea2-2552481a74dc","Type":"ContainerStarted","Data":"431551c9df4728a8dcad4364022020586a324d9c503648e5cc9580bd0d64a571"} Mar 20 17:53:56 crc kubenswrapper[4803]: I0320 17:53:56.528186 4803 generic.go:334] "Generic (PLEG): container finished" podID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerID="e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89" exitCode=0 Mar 20 17:53:56 crc kubenswrapper[4803]: I0320 17:53:56.528248 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szlvd" event={"ID":"028314ea-8df7-4b95-9ea2-2552481a74dc","Type":"ContainerDied","Data":"e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89"} Mar 20 17:53:57 crc kubenswrapper[4803]: I0320 17:53:57.537455 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szlvd" 
event={"ID":"028314ea-8df7-4b95-9ea2-2552481a74dc","Type":"ContainerStarted","Data":"ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7"} Mar 20 17:53:57 crc kubenswrapper[4803]: I0320 17:53:57.561728 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-szlvd" podStartSLOduration=1.96088492 podStartE2EDuration="4.561706767s" podCreationTimestamp="2026-03-20 17:53:53 +0000 UTC" firstStartedPulling="2026-03-20 17:53:54.486379484 +0000 UTC m=+2244.397971594" lastFinishedPulling="2026-03-20 17:53:57.087201371 +0000 UTC m=+2246.998793441" observedRunningTime="2026-03-20 17:53:57.554480591 +0000 UTC m=+2247.466072681" watchObservedRunningTime="2026-03-20 17:53:57.561706767 +0000 UTC m=+2247.473298837" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.152192 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567154-b5d9q"] Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.153900 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-b5d9q" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.157734 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.157977 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.159855 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.165121 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-b5d9q"] Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.336848 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rk5n\" (UniqueName: \"kubernetes.io/projected/3b57f205-75ea-4ba3-899c-611ed863cff7-kube-api-access-7rk5n\") pod \"auto-csr-approver-29567154-b5d9q\" (UID: \"3b57f205-75ea-4ba3-899c-611ed863cff7\") " pod="openshift-infra/auto-csr-approver-29567154-b5d9q" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.438299 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rk5n\" (UniqueName: \"kubernetes.io/projected/3b57f205-75ea-4ba3-899c-611ed863cff7-kube-api-access-7rk5n\") pod \"auto-csr-approver-29567154-b5d9q\" (UID: \"3b57f205-75ea-4ba3-899c-611ed863cff7\") " pod="openshift-infra/auto-csr-approver-29567154-b5d9q" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.460844 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rk5n\" (UniqueName: \"kubernetes.io/projected/3b57f205-75ea-4ba3-899c-611ed863cff7-kube-api-access-7rk5n\") pod \"auto-csr-approver-29567154-b5d9q\" (UID: \"3b57f205-75ea-4ba3-899c-611ed863cff7\") " 
pod="openshift-infra/auto-csr-approver-29567154-b5d9q" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.476911 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-b5d9q" Mar 20 17:54:00 crc kubenswrapper[4803]: I0320 17:54:00.949267 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-b5d9q"] Mar 20 17:54:00 crc kubenswrapper[4803]: W0320 17:54:00.951890 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b57f205_75ea_4ba3_899c_611ed863cff7.slice/crio-ceb2a9d0ddb6717440ac02525d4cdb7eb07431109e77046bce2d5ba7ee20d212 WatchSource:0}: Error finding container ceb2a9d0ddb6717440ac02525d4cdb7eb07431109e77046bce2d5ba7ee20d212: Status 404 returned error can't find the container with id ceb2a9d0ddb6717440ac02525d4cdb7eb07431109e77046bce2d5ba7ee20d212 Mar 20 17:54:01 crc kubenswrapper[4803]: I0320 17:54:01.578442 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-b5d9q" event={"ID":"3b57f205-75ea-4ba3-899c-611ed863cff7","Type":"ContainerStarted","Data":"ceb2a9d0ddb6717440ac02525d4cdb7eb07431109e77046bce2d5ba7ee20d212"} Mar 20 17:54:02 crc kubenswrapper[4803]: I0320 17:54:02.588754 4803 generic.go:334] "Generic (PLEG): container finished" podID="3b57f205-75ea-4ba3-899c-611ed863cff7" containerID="0959dc77a9f2d2aa346835d08a9ec8d7137fe0443d04b6706d0d3fe595e5af9d" exitCode=0 Mar 20 17:54:02 crc kubenswrapper[4803]: I0320 17:54:02.588819 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-b5d9q" event={"ID":"3b57f205-75ea-4ba3-899c-611ed863cff7","Type":"ContainerDied","Data":"0959dc77a9f2d2aa346835d08a9ec8d7137fe0443d04b6706d0d3fe595e5af9d"} Mar 20 17:54:03 crc kubenswrapper[4803]: I0320 17:54:03.724709 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:54:03 crc kubenswrapper[4803]: I0320 17:54:03.725021 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:54:03 crc kubenswrapper[4803]: I0320 17:54:03.775650 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:54:03 crc kubenswrapper[4803]: I0320 17:54:03.896516 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-b5d9q" Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.001584 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rk5n\" (UniqueName: \"kubernetes.io/projected/3b57f205-75ea-4ba3-899c-611ed863cff7-kube-api-access-7rk5n\") pod \"3b57f205-75ea-4ba3-899c-611ed863cff7\" (UID: \"3b57f205-75ea-4ba3-899c-611ed863cff7\") " Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.008726 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b57f205-75ea-4ba3-899c-611ed863cff7-kube-api-access-7rk5n" (OuterVolumeSpecName: "kube-api-access-7rk5n") pod "3b57f205-75ea-4ba3-899c-611ed863cff7" (UID: "3b57f205-75ea-4ba3-899c-611ed863cff7"). InnerVolumeSpecName "kube-api-access-7rk5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.111887 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rk5n\" (UniqueName: \"kubernetes.io/projected/3b57f205-75ea-4ba3-899c-611ed863cff7-kube-api-access-7rk5n\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.607713 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567154-b5d9q" event={"ID":"3b57f205-75ea-4ba3-899c-611ed863cff7","Type":"ContainerDied","Data":"ceb2a9d0ddb6717440ac02525d4cdb7eb07431109e77046bce2d5ba7ee20d212"} Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.608117 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceb2a9d0ddb6717440ac02525d4cdb7eb07431109e77046bce2d5ba7ee20d212" Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.607764 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567154-b5d9q" Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.650219 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.697497 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szlvd"] Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.960153 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-lbscd"] Mar 20 17:54:04 crc kubenswrapper[4803]: I0320 17:54:04.968019 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567148-lbscd"] Mar 20 17:54:06 crc kubenswrapper[4803]: I0320 17:54:06.623471 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-szlvd" 
podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="registry-server" containerID="cri-o://ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7" gracePeriod=2 Mar 20 17:54:06 crc kubenswrapper[4803]: I0320 17:54:06.870348 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac7e04c6-45d6-4f90-ba80-b9d56303fccb" path="/var/lib/kubelet/pods/ac7e04c6-45d6-4f90-ba80-b9d56303fccb/volumes" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.054267 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.168257 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blq29\" (UniqueName: \"kubernetes.io/projected/028314ea-8df7-4b95-9ea2-2552481a74dc-kube-api-access-blq29\") pod \"028314ea-8df7-4b95-9ea2-2552481a74dc\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.168335 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-catalog-content\") pod \"028314ea-8df7-4b95-9ea2-2552481a74dc\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.168371 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-utilities\") pod \"028314ea-8df7-4b95-9ea2-2552481a74dc\" (UID: \"028314ea-8df7-4b95-9ea2-2552481a74dc\") " Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.169694 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-utilities" (OuterVolumeSpecName: "utilities") pod "028314ea-8df7-4b95-9ea2-2552481a74dc" (UID: 
"028314ea-8df7-4b95-9ea2-2552481a74dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.177699 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/028314ea-8df7-4b95-9ea2-2552481a74dc-kube-api-access-blq29" (OuterVolumeSpecName: "kube-api-access-blq29") pod "028314ea-8df7-4b95-9ea2-2552481a74dc" (UID: "028314ea-8df7-4b95-9ea2-2552481a74dc"). InnerVolumeSpecName "kube-api-access-blq29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.196568 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "028314ea-8df7-4b95-9ea2-2552481a74dc" (UID: "028314ea-8df7-4b95-9ea2-2552481a74dc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.270718 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blq29\" (UniqueName: \"kubernetes.io/projected/028314ea-8df7-4b95-9ea2-2552481a74dc-kube-api-access-blq29\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.270756 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.270765 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028314ea-8df7-4b95-9ea2-2552481a74dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.633326 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerID="ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7" exitCode=0 Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.633366 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szlvd" event={"ID":"028314ea-8df7-4b95-9ea2-2552481a74dc","Type":"ContainerDied","Data":"ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7"} Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.633394 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-szlvd" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.633416 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-szlvd" event={"ID":"028314ea-8df7-4b95-9ea2-2552481a74dc","Type":"ContainerDied","Data":"431551c9df4728a8dcad4364022020586a324d9c503648e5cc9580bd0d64a571"} Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.633435 4803 scope.go:117] "RemoveContainer" containerID="ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.659372 4803 scope.go:117] "RemoveContainer" containerID="e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.684256 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-szlvd"] Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.705393 4803 scope.go:117] "RemoveContainer" containerID="6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.705898 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-szlvd"] Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.743298 4803 scope.go:117] "RemoveContainer" 
containerID="ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7" Mar 20 17:54:07 crc kubenswrapper[4803]: E0320 17:54:07.743997 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7\": container with ID starting with ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7 not found: ID does not exist" containerID="ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.744033 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7"} err="failed to get container status \"ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7\": rpc error: code = NotFound desc = could not find container \"ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7\": container with ID starting with ebb6e787b2fe40f74181b5354e0ef2c77b18989c60f3cabea868bc664b1096e7 not found: ID does not exist" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.744054 4803 scope.go:117] "RemoveContainer" containerID="e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89" Mar 20 17:54:07 crc kubenswrapper[4803]: E0320 17:54:07.744506 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89\": container with ID starting with e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89 not found: ID does not exist" containerID="e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.744625 4803 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89"} err="failed to get container status \"e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89\": rpc error: code = NotFound desc = could not find container \"e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89\": container with ID starting with e6a52136930e43c12c4b228240dd7710157bdc88172fd10fdb35c69a92930c89 not found: ID does not exist" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.744645 4803 scope.go:117] "RemoveContainer" containerID="6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81" Mar 20 17:54:07 crc kubenswrapper[4803]: E0320 17:54:07.744912 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81\": container with ID starting with 6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81 not found: ID does not exist" containerID="6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81" Mar 20 17:54:07 crc kubenswrapper[4803]: I0320 17:54:07.744935 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81"} err="failed to get container status \"6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81\": rpc error: code = NotFound desc = could not find container \"6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81\": container with ID starting with 6c92984009858237e1e9a9c6b8272ee4dfb97ffbada7adf127cddb8c8edb2c81 not found: ID does not exist" Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.246210 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.246269 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.246318 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.247108 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.247173 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" gracePeriod=600 Mar 20 17:54:08 crc kubenswrapper[4803]: E0320 17:54:08.365443 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:54:08 crc 
kubenswrapper[4803]: I0320 17:54:08.646518 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" exitCode=0 Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.646592 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d"} Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.646663 4803 scope.go:117] "RemoveContainer" containerID="59c8765f1a44e70897205c8cc4ba9fd464249da755c0e1d0c4f525a2371cd37d" Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.647655 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:54:08 crc kubenswrapper[4803]: E0320 17:54:08.648102 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:54:08 crc kubenswrapper[4803]: I0320 17:54:08.861069 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" path="/var/lib/kubelet/pods/028314ea-8df7-4b95-9ea2-2552481a74dc/volumes" Mar 20 17:54:12 crc kubenswrapper[4803]: I0320 17:54:12.764704 4803 generic.go:334] "Generic (PLEG): container finished" podID="f644be5e-0ec5-499a-a42d-b4381159e310" containerID="4a0aace26aeabddd9513d7ce39f7a49bad437429dceb0ddd981c96718727fc2e" exitCode=0 Mar 20 17:54:12 crc kubenswrapper[4803]: I0320 17:54:12.764953 4803 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" event={"ID":"f644be5e-0ec5-499a-a42d-b4381159e310","Type":"ContainerDied","Data":"4a0aace26aeabddd9513d7ce39f7a49bad437429dceb0ddd981c96718727fc2e"} Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.310213 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.438416 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-inventory\") pod \"f644be5e-0ec5-499a-a42d-b4381159e310\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.438625 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-ssh-key-openstack-edpm-ipam\") pod \"f644be5e-0ec5-499a-a42d-b4381159e310\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.438658 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-combined-ca-bundle\") pod \"f644be5e-0ec5-499a-a42d-b4381159e310\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.438781 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-secret-0\") pod \"f644be5e-0ec5-499a-a42d-b4381159e310\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.438805 4803 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf5xp\" (UniqueName: \"kubernetes.io/projected/f644be5e-0ec5-499a-a42d-b4381159e310-kube-api-access-jf5xp\") pod \"f644be5e-0ec5-499a-a42d-b4381159e310\" (UID: \"f644be5e-0ec5-499a-a42d-b4381159e310\") " Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.443759 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f644be5e-0ec5-499a-a42d-b4381159e310-kube-api-access-jf5xp" (OuterVolumeSpecName: "kube-api-access-jf5xp") pod "f644be5e-0ec5-499a-a42d-b4381159e310" (UID: "f644be5e-0ec5-499a-a42d-b4381159e310"). InnerVolumeSpecName "kube-api-access-jf5xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.452980 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f644be5e-0ec5-499a-a42d-b4381159e310" (UID: "f644be5e-0ec5-499a-a42d-b4381159e310"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.464158 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-inventory" (OuterVolumeSpecName: "inventory") pod "f644be5e-0ec5-499a-a42d-b4381159e310" (UID: "f644be5e-0ec5-499a-a42d-b4381159e310"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.467177 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f644be5e-0ec5-499a-a42d-b4381159e310" (UID: "f644be5e-0ec5-499a-a42d-b4381159e310"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.475199 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "f644be5e-0ec5-499a-a42d-b4381159e310" (UID: "f644be5e-0ec5-499a-a42d-b4381159e310"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.541177 4803 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.541465 4803 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.541483 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf5xp\" (UniqueName: \"kubernetes.io/projected/f644be5e-0ec5-499a-a42d-b4381159e310-kube-api-access-jf5xp\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.541503 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.541548 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f644be5e-0ec5-499a-a42d-b4381159e310-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.785347 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" event={"ID":"f644be5e-0ec5-499a-a42d-b4381159e310","Type":"ContainerDied","Data":"499341b6f7e46ce05999a2072b978485372329f2f7d1951942e0fbc82ea24201"} Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.785394 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="499341b6f7e46ce05999a2072b978485372329f2f7d1951942e0fbc82ea24201" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.785434 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.913819 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2"] Mar 20 17:54:14 crc kubenswrapper[4803]: E0320 17:54:14.914203 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="registry-server" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914220 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="registry-server" Mar 20 17:54:14 crc kubenswrapper[4803]: E0320 17:54:14.914246 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="extract-content" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914255 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="extract-content" Mar 20 17:54:14 crc kubenswrapper[4803]: E0320 17:54:14.914275 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="extract-utilities" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914282 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="extract-utilities" Mar 20 17:54:14 crc kubenswrapper[4803]: E0320 17:54:14.914294 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f644be5e-0ec5-499a-a42d-b4381159e310" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914300 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="f644be5e-0ec5-499a-a42d-b4381159e310" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:14 crc kubenswrapper[4803]: E0320 17:54:14.914313 4803 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b57f205-75ea-4ba3-899c-611ed863cff7" containerName="oc" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914318 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b57f205-75ea-4ba3-899c-611ed863cff7" containerName="oc" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914499 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b57f205-75ea-4ba3-899c-611ed863cff7" containerName="oc" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914514 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="028314ea-8df7-4b95-9ea2-2552481a74dc" containerName="registry-server" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.914644 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="f644be5e-0ec5-499a-a42d-b4381159e310" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.915206 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.917476 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.917492 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.917638 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.917810 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.917973 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.918082 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.918263 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:54:14 crc kubenswrapper[4803]: I0320 17:54:14.944146 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2"] Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052184 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 
17:54:15.052256 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052293 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052329 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052410 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6bj7\" (UniqueName: \"kubernetes.io/projected/0b06d67e-e61d-499b-bce7-41b3f0b3b509-kube-api-access-s6bj7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052462 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" 
(UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052662 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052772 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052852 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.052942 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: 
\"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.053034 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.155722 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6bj7\" (UniqueName: \"kubernetes.io/projected/0b06d67e-e61d-499b-bce7-41b3f0b3b509-kube-api-access-s6bj7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.155836 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.155879 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.155932 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.155988 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.156054 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.156114 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.156153 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: 
\"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.156210 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.156265 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.156313 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.156878 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.161047 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.162266 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.163471 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.164166 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.164826 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.165434 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.165623 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.166008 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.173094 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.180223 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6bj7\" (UniqueName: 
\"kubernetes.io/projected/0b06d67e-e61d-499b-bce7-41b3f0b3b509-kube-api-access-s6bj7\") pod \"nova-edpm-deployment-openstack-edpm-ipam-9nrx2\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.245896 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.627163 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2"] Mar 20 17:54:15 crc kubenswrapper[4803]: W0320 17:54:15.628635 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b06d67e_e61d_499b_bce7_41b3f0b3b509.slice/crio-f1cba03ca6f6c71ecabbfedff142299174336e1a73d04bfdb9cd1ac1e8a115aa WatchSource:0}: Error finding container f1cba03ca6f6c71ecabbfedff142299174336e1a73d04bfdb9cd1ac1e8a115aa: Status 404 returned error can't find the container with id f1cba03ca6f6c71ecabbfedff142299174336e1a73d04bfdb9cd1ac1e8a115aa Mar 20 17:54:15 crc kubenswrapper[4803]: I0320 17:54:15.800461 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" event={"ID":"0b06d67e-e61d-499b-bce7-41b3f0b3b509","Type":"ContainerStarted","Data":"f1cba03ca6f6c71ecabbfedff142299174336e1a73d04bfdb9cd1ac1e8a115aa"} Mar 20 17:54:18 crc kubenswrapper[4803]: I0320 17:54:18.834468 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" event={"ID":"0b06d67e-e61d-499b-bce7-41b3f0b3b509","Type":"ContainerStarted","Data":"07d21beeecd50b1e7e7ac52340f6a0e8c66631e7f46f9ebc2e7d6fa4e36504a6"} Mar 20 17:54:18 crc kubenswrapper[4803]: I0320 17:54:18.878312 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" podStartSLOduration=2.970434996 podStartE2EDuration="4.878280537s" podCreationTimestamp="2026-03-20 17:54:14 +0000 UTC" firstStartedPulling="2026-03-20 17:54:15.631668744 +0000 UTC m=+2265.543260814" lastFinishedPulling="2026-03-20 17:54:17.539514285 +0000 UTC m=+2267.451106355" observedRunningTime="2026-03-20 17:54:18.867916442 +0000 UTC m=+2268.779508552" watchObservedRunningTime="2026-03-20 17:54:18.878280537 +0000 UTC m=+2268.789872637" Mar 20 17:54:21 crc kubenswrapper[4803]: I0320 17:54:21.848766 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:54:21 crc kubenswrapper[4803]: E0320 17:54:21.849453 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:54:23 crc kubenswrapper[4803]: I0320 17:54:23.956643 4803 scope.go:117] "RemoveContainer" containerID="d599b5fc8384c0b39acfc0f027de3e6589dc2e228a077a9e4ada5edf8689b8d8" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.275036 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q87mb"] Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.280713 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.321090 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q87mb"] Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.355975 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-catalog-content\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.356056 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-utilities\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.356119 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mdm\" (UniqueName: \"kubernetes.io/projected/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-kube-api-access-b6mdm\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.457375 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-catalog-content\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.457461 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-utilities\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.457901 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-utilities\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.457500 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6mdm\" (UniqueName: \"kubernetes.io/projected/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-kube-api-access-b6mdm\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.458021 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-catalog-content\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.476651 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6mdm\" (UniqueName: \"kubernetes.io/projected/67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c-kube-api-access-b6mdm\") pod \"certified-operators-q87mb\" (UID: \"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c\") " pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:30 crc kubenswrapper[4803]: I0320 17:54:30.601043 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:31 crc kubenswrapper[4803]: I0320 17:54:31.053030 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q87mb"] Mar 20 17:54:31 crc kubenswrapper[4803]: W0320 17:54:31.054807 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67f8c88e_3bdc_403b_8ed0_8c1c2cab9d6c.slice/crio-d7df05971056ecbfd14e1e4f90b1356faa71149a44d76917b10f744424dae073 WatchSource:0}: Error finding container d7df05971056ecbfd14e1e4f90b1356faa71149a44d76917b10f744424dae073: Status 404 returned error can't find the container with id d7df05971056ecbfd14e1e4f90b1356faa71149a44d76917b10f744424dae073 Mar 20 17:54:31 crc kubenswrapper[4803]: I0320 17:54:31.948297 4803 generic.go:334] "Generic (PLEG): container finished" podID="67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c" containerID="e9afc254355a246966ed8f49c99eaac7e8dd9f6f586b72bbe74223184523868d" exitCode=0 Mar 20 17:54:31 crc kubenswrapper[4803]: I0320 17:54:31.948360 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q87mb" event={"ID":"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c","Type":"ContainerDied","Data":"e9afc254355a246966ed8f49c99eaac7e8dd9f6f586b72bbe74223184523868d"} Mar 20 17:54:31 crc kubenswrapper[4803]: I0320 17:54:31.948674 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q87mb" event={"ID":"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c","Type":"ContainerStarted","Data":"d7df05971056ecbfd14e1e4f90b1356faa71149a44d76917b10f744424dae073"} Mar 20 17:54:34 crc kubenswrapper[4803]: I0320 17:54:34.849388 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:54:34 crc kubenswrapper[4803]: E0320 17:54:34.850088 4803 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:54:39 crc kubenswrapper[4803]: I0320 17:54:39.019585 4803 generic.go:334] "Generic (PLEG): container finished" podID="67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c" containerID="8506df8922d779ff466a62bf418c8c137b756c86e6f652a42ab430c0a611bf59" exitCode=0 Mar 20 17:54:39 crc kubenswrapper[4803]: I0320 17:54:39.019638 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q87mb" event={"ID":"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c","Type":"ContainerDied","Data":"8506df8922d779ff466a62bf418c8c137b756c86e6f652a42ab430c0a611bf59"} Mar 20 17:54:40 crc kubenswrapper[4803]: I0320 17:54:40.039342 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q87mb" event={"ID":"67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c","Type":"ContainerStarted","Data":"0fcfb7934ccaed65f506197f0a73d86c6fda08226461f5af45040f8f8895ab89"} Mar 20 17:54:40 crc kubenswrapper[4803]: I0320 17:54:40.061945 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q87mb" podStartSLOduration=2.5860735 podStartE2EDuration="10.061916663s" podCreationTimestamp="2026-03-20 17:54:30 +0000 UTC" firstStartedPulling="2026-03-20 17:54:31.953622663 +0000 UTC m=+2281.865214733" lastFinishedPulling="2026-03-20 17:54:39.429465816 +0000 UTC m=+2289.341057896" observedRunningTime="2026-03-20 17:54:40.057746354 +0000 UTC m=+2289.969338444" watchObservedRunningTime="2026-03-20 17:54:40.061916663 +0000 UTC m=+2289.973508733" Mar 20 17:54:40 crc kubenswrapper[4803]: I0320 17:54:40.601740 4803 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:40 crc kubenswrapper[4803]: I0320 17:54:40.602220 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:41 crc kubenswrapper[4803]: I0320 17:54:41.664318 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-q87mb" podUID="67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c" containerName="registry-server" probeResult="failure" output=< Mar 20 17:54:41 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 17:54:41 crc kubenswrapper[4803]: > Mar 20 17:54:48 crc kubenswrapper[4803]: I0320 17:54:48.848505 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:54:48 crc kubenswrapper[4803]: E0320 17:54:48.849699 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:54:50 crc kubenswrapper[4803]: I0320 17:54:50.675218 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:50 crc kubenswrapper[4803]: I0320 17:54:50.751075 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q87mb" Mar 20 17:54:50 crc kubenswrapper[4803]: I0320 17:54:50.881959 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q87mb"] Mar 20 17:54:50 crc kubenswrapper[4803]: I0320 
17:54:50.926260 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hbxm"] Mar 20 17:54:50 crc kubenswrapper[4803]: I0320 17:54:50.926491 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hbxm" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="registry-server" containerID="cri-o://3f7841838c08208c6fee3631458944ecb2c6989a1b04d346a9b6050191873234" gracePeriod=2 Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.152229 4803 generic.go:334] "Generic (PLEG): container finished" podID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerID="3f7841838c08208c6fee3631458944ecb2c6989a1b04d346a9b6050191873234" exitCode=0 Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.153729 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hbxm" event={"ID":"d1eeda39-65c5-416d-9ad7-1de7c20e49f7","Type":"ContainerDied","Data":"3f7841838c08208c6fee3631458944ecb2c6989a1b04d346a9b6050191873234"} Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.378552 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.475155 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-catalog-content\") pod \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.475271 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6flqq\" (UniqueName: \"kubernetes.io/projected/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-kube-api-access-6flqq\") pod \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.475382 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-utilities\") pod \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\" (UID: \"d1eeda39-65c5-416d-9ad7-1de7c20e49f7\") " Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.476240 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-utilities" (OuterVolumeSpecName: "utilities") pod "d1eeda39-65c5-416d-9ad7-1de7c20e49f7" (UID: "d1eeda39-65c5-416d-9ad7-1de7c20e49f7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.483668 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-kube-api-access-6flqq" (OuterVolumeSpecName: "kube-api-access-6flqq") pod "d1eeda39-65c5-416d-9ad7-1de7c20e49f7" (UID: "d1eeda39-65c5-416d-9ad7-1de7c20e49f7"). InnerVolumeSpecName "kube-api-access-6flqq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.524363 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1eeda39-65c5-416d-9ad7-1de7c20e49f7" (UID: "d1eeda39-65c5-416d-9ad7-1de7c20e49f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.577699 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.577739 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6flqq\" (UniqueName: \"kubernetes.io/projected/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-kube-api-access-6flqq\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:51 crc kubenswrapper[4803]: I0320 17:54:51.577755 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1eeda39-65c5-416d-9ad7-1de7c20e49f7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.162502 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hbxm" event={"ID":"d1eeda39-65c5-416d-9ad7-1de7c20e49f7","Type":"ContainerDied","Data":"4e276ff67fe28843ac299db8ed322860c82d5aaa81c24ce20a01cf9a3e14af1e"} Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.162567 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hbxm" Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.162612 4803 scope.go:117] "RemoveContainer" containerID="3f7841838c08208c6fee3631458944ecb2c6989a1b04d346a9b6050191873234" Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.184110 4803 scope.go:117] "RemoveContainer" containerID="eb5434a31ce7079dbc36cbbc9b2004e38a5675dbfc5fa61195f71f2100c86798" Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.198698 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hbxm"] Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.206224 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hbxm"] Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.242179 4803 scope.go:117] "RemoveContainer" containerID="0582dca2ee1d31a8801270f1a7afa0eedaa47141b11444525247240a29a52b39" Mar 20 17:54:52 crc kubenswrapper[4803]: I0320 17:54:52.862814 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" path="/var/lib/kubelet/pods/d1eeda39-65c5-416d-9ad7-1de7c20e49f7/volumes" Mar 20 17:54:59 crc kubenswrapper[4803]: I0320 17:54:59.848253 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:54:59 crc kubenswrapper[4803]: E0320 17:54:59.849365 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.519157 4803 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-6sv4k"] Mar 20 17:55:04 crc kubenswrapper[4803]: E0320 17:55:04.520346 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="extract-content" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.520370 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="extract-content" Mar 20 17:55:04 crc kubenswrapper[4803]: E0320 17:55:04.520393 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="extract-utilities" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.520402 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="extract-utilities" Mar 20 17:55:04 crc kubenswrapper[4803]: E0320 17:55:04.520452 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="registry-server" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.520462 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="registry-server" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.520702 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1eeda39-65c5-416d-9ad7-1de7c20e49f7" containerName="registry-server" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.522631 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.532686 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6sv4k"] Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.660117 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbrx\" (UniqueName: \"kubernetes.io/projected/383bb4a7-0420-4b4d-b283-1a99ee87005c-kube-api-access-qvbrx\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.660219 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-catalog-content\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.660418 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-utilities\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.761837 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbrx\" (UniqueName: \"kubernetes.io/projected/383bb4a7-0420-4b4d-b283-1a99ee87005c-kube-api-access-qvbrx\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.761893 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-catalog-content\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.761994 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-utilities\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.762494 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-utilities\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.762592 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-catalog-content\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.789826 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbrx\" (UniqueName: \"kubernetes.io/projected/383bb4a7-0420-4b4d-b283-1a99ee87005c-kube-api-access-qvbrx\") pod \"redhat-operators-6sv4k\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:04 crc kubenswrapper[4803]: I0320 17:55:04.844453 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:05 crc kubenswrapper[4803]: I0320 17:55:05.326335 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6sv4k"] Mar 20 17:55:05 crc kubenswrapper[4803]: I0320 17:55:05.668239 4803 generic.go:334] "Generic (PLEG): container finished" podID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerID="177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980" exitCode=0 Mar 20 17:55:05 crc kubenswrapper[4803]: I0320 17:55:05.668331 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sv4k" event={"ID":"383bb4a7-0420-4b4d-b283-1a99ee87005c","Type":"ContainerDied","Data":"177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980"} Mar 20 17:55:05 crc kubenswrapper[4803]: I0320 17:55:05.669608 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sv4k" event={"ID":"383bb4a7-0420-4b4d-b283-1a99ee87005c","Type":"ContainerStarted","Data":"ecff8b3965a8011daf517b9737e557e31669f41c06d646b0901ee6b568aecbd0"} Mar 20 17:55:06 crc kubenswrapper[4803]: I0320 17:55:06.680862 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sv4k" event={"ID":"383bb4a7-0420-4b4d-b283-1a99ee87005c","Type":"ContainerStarted","Data":"25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9"} Mar 20 17:55:09 crc kubenswrapper[4803]: I0320 17:55:09.715901 4803 generic.go:334] "Generic (PLEG): container finished" podID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerID="25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9" exitCode=0 Mar 20 17:55:09 crc kubenswrapper[4803]: I0320 17:55:09.716610 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sv4k" 
event={"ID":"383bb4a7-0420-4b4d-b283-1a99ee87005c","Type":"ContainerDied","Data":"25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9"} Mar 20 17:55:10 crc kubenswrapper[4803]: I0320 17:55:10.727610 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sv4k" event={"ID":"383bb4a7-0420-4b4d-b283-1a99ee87005c","Type":"ContainerStarted","Data":"1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0"} Mar 20 17:55:10 crc kubenswrapper[4803]: I0320 17:55:10.747518 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6sv4k" podStartSLOduration=2.057069711 podStartE2EDuration="6.747500078s" podCreationTimestamp="2026-03-20 17:55:04 +0000 UTC" firstStartedPulling="2026-03-20 17:55:05.669886927 +0000 UTC m=+2315.581478997" lastFinishedPulling="2026-03-20 17:55:10.360317284 +0000 UTC m=+2320.271909364" observedRunningTime="2026-03-20 17:55:10.746438658 +0000 UTC m=+2320.658030748" watchObservedRunningTime="2026-03-20 17:55:10.747500078 +0000 UTC m=+2320.659092148" Mar 20 17:55:10 crc kubenswrapper[4803]: I0320 17:55:10.855648 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:55:10 crc kubenswrapper[4803]: E0320 17:55:10.855900 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:55:14 crc kubenswrapper[4803]: I0320 17:55:14.844977 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:14 crc 
kubenswrapper[4803]: I0320 17:55:14.845395 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:15 crc kubenswrapper[4803]: I0320 17:55:15.913107 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6sv4k" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="registry-server" probeResult="failure" output=< Mar 20 17:55:15 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 17:55:15 crc kubenswrapper[4803]: > Mar 20 17:55:24 crc kubenswrapper[4803]: I0320 17:55:24.848179 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:55:24 crc kubenswrapper[4803]: E0320 17:55:24.849243 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:55:24 crc kubenswrapper[4803]: I0320 17:55:24.912875 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:24 crc kubenswrapper[4803]: I0320 17:55:24.961282 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:25 crc kubenswrapper[4803]: I0320 17:55:25.170675 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6sv4k"] Mar 20 17:55:26 crc kubenswrapper[4803]: I0320 17:55:26.869006 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6sv4k" 
podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="registry-server" containerID="cri-o://1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0" gracePeriod=2 Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.295139 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.446728 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-utilities\") pod \"383bb4a7-0420-4b4d-b283-1a99ee87005c\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.446917 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-catalog-content\") pod \"383bb4a7-0420-4b4d-b283-1a99ee87005c\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.446989 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvbrx\" (UniqueName: \"kubernetes.io/projected/383bb4a7-0420-4b4d-b283-1a99ee87005c-kube-api-access-qvbrx\") pod \"383bb4a7-0420-4b4d-b283-1a99ee87005c\" (UID: \"383bb4a7-0420-4b4d-b283-1a99ee87005c\") " Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.448320 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-utilities" (OuterVolumeSpecName: "utilities") pod "383bb4a7-0420-4b4d-b283-1a99ee87005c" (UID: "383bb4a7-0420-4b4d-b283-1a99ee87005c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.457369 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383bb4a7-0420-4b4d-b283-1a99ee87005c-kube-api-access-qvbrx" (OuterVolumeSpecName: "kube-api-access-qvbrx") pod "383bb4a7-0420-4b4d-b283-1a99ee87005c" (UID: "383bb4a7-0420-4b4d-b283-1a99ee87005c"). InnerVolumeSpecName "kube-api-access-qvbrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.549228 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvbrx\" (UniqueName: \"kubernetes.io/projected/383bb4a7-0420-4b4d-b283-1a99ee87005c-kube-api-access-qvbrx\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.549265 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.600733 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "383bb4a7-0420-4b4d-b283-1a99ee87005c" (UID: "383bb4a7-0420-4b4d-b283-1a99ee87005c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.650730 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/383bb4a7-0420-4b4d-b283-1a99ee87005c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.877667 4803 generic.go:334] "Generic (PLEG): container finished" podID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerID="1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0" exitCode=0 Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.877716 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sv4k" event={"ID":"383bb4a7-0420-4b4d-b283-1a99ee87005c","Type":"ContainerDied","Data":"1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0"} Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.877732 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6sv4k" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.877747 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6sv4k" event={"ID":"383bb4a7-0420-4b4d-b283-1a99ee87005c","Type":"ContainerDied","Data":"ecff8b3965a8011daf517b9737e557e31669f41c06d646b0901ee6b568aecbd0"} Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.877769 4803 scope.go:117] "RemoveContainer" containerID="1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.903603 4803 scope.go:117] "RemoveContainer" containerID="25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.917773 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6sv4k"] Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.927421 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6sv4k"] Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.935928 4803 scope.go:117] "RemoveContainer" containerID="177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.972657 4803 scope.go:117] "RemoveContainer" containerID="1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0" Mar 20 17:55:27 crc kubenswrapper[4803]: E0320 17:55:27.973010 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0\": container with ID starting with 1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0 not found: ID does not exist" containerID="1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.973044 4803 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0"} err="failed to get container status \"1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0\": rpc error: code = NotFound desc = could not find container \"1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0\": container with ID starting with 1c2a344172702501c78cd8c92eda6d54c4c3aa6ad971276b299a91bf0cd0e3f0 not found: ID does not exist" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.973065 4803 scope.go:117] "RemoveContainer" containerID="25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9" Mar 20 17:55:27 crc kubenswrapper[4803]: E0320 17:55:27.973365 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9\": container with ID starting with 25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9 not found: ID does not exist" containerID="25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.973397 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9"} err="failed to get container status \"25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9\": rpc error: code = NotFound desc = could not find container \"25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9\": container with ID starting with 25b71e331c898a435cc0d93e8f1131708fb2d68185e95d0435dd5ab5b32564f9 not found: ID does not exist" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.973409 4803 scope.go:117] "RemoveContainer" containerID="177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980" Mar 20 17:55:27 crc kubenswrapper[4803]: E0320 
17:55:27.973852 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980\": container with ID starting with 177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980 not found: ID does not exist" containerID="177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980" Mar 20 17:55:27 crc kubenswrapper[4803]: I0320 17:55:27.973895 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980"} err="failed to get container status \"177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980\": rpc error: code = NotFound desc = could not find container \"177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980\": container with ID starting with 177eab12e738e0f999d88e44ae2686cd779085fb339e4f1508f80a1ed5f25980 not found: ID does not exist" Mar 20 17:55:28 crc kubenswrapper[4803]: I0320 17:55:28.858391 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" path="/var/lib/kubelet/pods/383bb4a7-0420-4b4d-b283-1a99ee87005c/volumes" Mar 20 17:55:37 crc kubenswrapper[4803]: I0320 17:55:37.848313 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:55:37 crc kubenswrapper[4803]: E0320 17:55:37.849227 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:55:48 crc kubenswrapper[4803]: I0320 17:55:48.847943 
4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:55:48 crc kubenswrapper[4803]: E0320 17:55:48.848738 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.166639 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567156-zdnsv"] Mar 20 17:56:00 crc kubenswrapper[4803]: E0320 17:56:00.167588 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="extract-utilities" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.167601 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="extract-utilities" Mar 20 17:56:00 crc kubenswrapper[4803]: E0320 17:56:00.167617 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="extract-content" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.167623 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="extract-content" Mar 20 17:56:00 crc kubenswrapper[4803]: E0320 17:56:00.167631 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="registry-server" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.167637 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="registry-server" Mar 20 17:56:00 crc 
kubenswrapper[4803]: I0320 17:56:00.167809 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="383bb4a7-0420-4b4d-b283-1a99ee87005c" containerName="registry-server" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.168402 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-zdnsv" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.178372 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.178883 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.189000 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-zdnsv"] Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.189747 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.303068 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9pfh\" (UniqueName: \"kubernetes.io/projected/2646be0c-2337-420b-8443-eead2fe04663-kube-api-access-d9pfh\") pod \"auto-csr-approver-29567156-zdnsv\" (UID: \"2646be0c-2337-420b-8443-eead2fe04663\") " pod="openshift-infra/auto-csr-approver-29567156-zdnsv" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.405330 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9pfh\" (UniqueName: \"kubernetes.io/projected/2646be0c-2337-420b-8443-eead2fe04663-kube-api-access-d9pfh\") pod \"auto-csr-approver-29567156-zdnsv\" (UID: \"2646be0c-2337-420b-8443-eead2fe04663\") " pod="openshift-infra/auto-csr-approver-29567156-zdnsv" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.423314 
4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9pfh\" (UniqueName: \"kubernetes.io/projected/2646be0c-2337-420b-8443-eead2fe04663-kube-api-access-d9pfh\") pod \"auto-csr-approver-29567156-zdnsv\" (UID: \"2646be0c-2337-420b-8443-eead2fe04663\") " pod="openshift-infra/auto-csr-approver-29567156-zdnsv" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.491903 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-zdnsv" Mar 20 17:56:00 crc kubenswrapper[4803]: I0320 17:56:00.939175 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-zdnsv"] Mar 20 17:56:01 crc kubenswrapper[4803]: I0320 17:56:01.206579 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-zdnsv" event={"ID":"2646be0c-2337-420b-8443-eead2fe04663","Type":"ContainerStarted","Data":"886c373c6e67fa3dbc5b619168b65ebd439f02ed2f8e4e91d39dbf3bc417bdf7"} Mar 20 17:56:03 crc kubenswrapper[4803]: I0320 17:56:03.224876 4803 generic.go:334] "Generic (PLEG): container finished" podID="2646be0c-2337-420b-8443-eead2fe04663" containerID="f2532284a06b6a23214030e3cdd8b9140783bb8ac7ecc77e91b2bb77c833c5f1" exitCode=0 Mar 20 17:56:03 crc kubenswrapper[4803]: I0320 17:56:03.224947 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-zdnsv" event={"ID":"2646be0c-2337-420b-8443-eead2fe04663","Type":"ContainerDied","Data":"f2532284a06b6a23214030e3cdd8b9140783bb8ac7ecc77e91b2bb77c833c5f1"} Mar 20 17:56:03 crc kubenswrapper[4803]: I0320 17:56:03.848103 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:56:03 crc kubenswrapper[4803]: E0320 17:56:03.848329 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:56:04 crc kubenswrapper[4803]: I0320 17:56:04.623452 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-zdnsv" Mar 20 17:56:04 crc kubenswrapper[4803]: I0320 17:56:04.684752 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9pfh\" (UniqueName: \"kubernetes.io/projected/2646be0c-2337-420b-8443-eead2fe04663-kube-api-access-d9pfh\") pod \"2646be0c-2337-420b-8443-eead2fe04663\" (UID: \"2646be0c-2337-420b-8443-eead2fe04663\") " Mar 20 17:56:04 crc kubenswrapper[4803]: I0320 17:56:04.693174 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2646be0c-2337-420b-8443-eead2fe04663-kube-api-access-d9pfh" (OuterVolumeSpecName: "kube-api-access-d9pfh") pod "2646be0c-2337-420b-8443-eead2fe04663" (UID: "2646be0c-2337-420b-8443-eead2fe04663"). InnerVolumeSpecName "kube-api-access-d9pfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:04 crc kubenswrapper[4803]: I0320 17:56:04.788073 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9pfh\" (UniqueName: \"kubernetes.io/projected/2646be0c-2337-420b-8443-eead2fe04663-kube-api-access-d9pfh\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:05 crc kubenswrapper[4803]: I0320 17:56:05.256880 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567156-zdnsv" event={"ID":"2646be0c-2337-420b-8443-eead2fe04663","Type":"ContainerDied","Data":"886c373c6e67fa3dbc5b619168b65ebd439f02ed2f8e4e91d39dbf3bc417bdf7"} Mar 20 17:56:05 crc kubenswrapper[4803]: I0320 17:56:05.257384 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="886c373c6e67fa3dbc5b619168b65ebd439f02ed2f8e4e91d39dbf3bc417bdf7" Mar 20 17:56:05 crc kubenswrapper[4803]: I0320 17:56:05.257206 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567156-zdnsv" Mar 20 17:56:05 crc kubenswrapper[4803]: I0320 17:56:05.690817 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-b897s"] Mar 20 17:56:05 crc kubenswrapper[4803]: I0320 17:56:05.698832 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567150-b897s"] Mar 20 17:56:06 crc kubenswrapper[4803]: I0320 17:56:06.873636 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3f88542-c200-48c2-a14e-ced4a4f7be3d" path="/var/lib/kubelet/pods/f3f88542-c200-48c2-a14e-ced4a4f7be3d/volumes" Mar 20 17:56:17 crc kubenswrapper[4803]: I0320 17:56:17.848497 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:56:17 crc kubenswrapper[4803]: E0320 17:56:17.849859 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:56:24 crc kubenswrapper[4803]: I0320 17:56:24.102215 4803 scope.go:117] "RemoveContainer" containerID="07a5ad5600f6f2b247631623210160a6db220194a247797795dd807a2698b0d5" Mar 20 17:56:32 crc kubenswrapper[4803]: I0320 17:56:32.848361 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:56:32 crc kubenswrapper[4803]: E0320 17:56:32.848928 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:56:40 crc kubenswrapper[4803]: I0320 17:56:40.635556 4803 generic.go:334] "Generic (PLEG): container finished" podID="0b06d67e-e61d-499b-bce7-41b3f0b3b509" containerID="07d21beeecd50b1e7e7ac52340f6a0e8c66631e7f46f9ebc2e7d6fa4e36504a6" exitCode=0 Mar 20 17:56:40 crc kubenswrapper[4803]: I0320 17:56:40.635642 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" event={"ID":"0b06d67e-e61d-499b-bce7-41b3f0b3b509","Type":"ContainerDied","Data":"07d21beeecd50b1e7e7ac52340f6a0e8c66631e7f46f9ebc2e7d6fa4e36504a6"} Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.039512 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.144478 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-0\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.144622 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6bj7\" (UniqueName: \"kubernetes.io/projected/0b06d67e-e61d-499b-bce7-41b3f0b3b509-kube-api-access-s6bj7\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.144762 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-ssh-key-openstack-edpm-ipam\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.144936 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-extra-config-0\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.145061 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-1\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 
17:56:42.145100 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-combined-ca-bundle\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.145144 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-0\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.145177 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-3\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.145225 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-1\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.145311 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-2\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.145371 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-inventory\") pod \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\" (UID: \"0b06d67e-e61d-499b-bce7-41b3f0b3b509\") " Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.151580 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b06d67e-e61d-499b-bce7-41b3f0b3b509-kube-api-access-s6bj7" (OuterVolumeSpecName: "kube-api-access-s6bj7") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "kube-api-access-s6bj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.152053 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.175494 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.176177 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.179054 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.180021 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.182035 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-inventory" (OuterVolumeSpecName: "inventory") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.190578 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.197701 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.198307 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.200704 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b06d67e-e61d-499b-bce7-41b3f0b3b509" (UID: "0b06d67e-e61d-499b-bce7-41b3f0b3b509"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.248719 4803 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.248831 4803 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.248888 4803 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.248947 4803 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.249007 4803 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.249076 4803 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.249149 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.249206 4803 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.249266 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6bj7\" (UniqueName: \"kubernetes.io/projected/0b06d67e-e61d-499b-bce7-41b3f0b3b509-kube-api-access-s6bj7\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.249324 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b06d67e-e61d-499b-bce7-41b3f0b3b509-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.249385 4803 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/0b06d67e-e61d-499b-bce7-41b3f0b3b509-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.657566 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" event={"ID":"0b06d67e-e61d-499b-bce7-41b3f0b3b509","Type":"ContainerDied","Data":"f1cba03ca6f6c71ecabbfedff142299174336e1a73d04bfdb9cd1ac1e8a115aa"} Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.657643 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1cba03ca6f6c71ecabbfedff142299174336e1a73d04bfdb9cd1ac1e8a115aa" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.657656 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-9nrx2" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.779863 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w"] Mar 20 17:56:42 crc kubenswrapper[4803]: E0320 17:56:42.780626 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2646be0c-2337-420b-8443-eead2fe04663" containerName="oc" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.780724 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="2646be0c-2337-420b-8443-eead2fe04663" containerName="oc" Mar 20 17:56:42 crc kubenswrapper[4803]: E0320 17:56:42.780847 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b06d67e-e61d-499b-bce7-41b3f0b3b509" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.780924 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b06d67e-e61d-499b-bce7-41b3f0b3b509" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.781297 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="2646be0c-2337-420b-8443-eead2fe04663" containerName="oc" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.781406 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b06d67e-e61d-499b-bce7-41b3f0b3b509" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.782407 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.786312 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-pmvzt" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.786634 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.786858 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.787038 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.788542 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.792360 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w"] Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.863989 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27nxr\" (UniqueName: \"kubernetes.io/projected/149f9011-aff3-4d4a-ac40-d1325e1bdad0-kube-api-access-27nxr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.864319 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: 
\"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.864349 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.864399 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.864456 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.864511 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.864587 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.966357 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.966443 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.966551 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27nxr\" (UniqueName: \"kubernetes.io/projected/149f9011-aff3-4d4a-ac40-d1325e1bdad0-kube-api-access-27nxr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.966612 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.966649 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.966709 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.966788 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.973817 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.973939 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.974745 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.981386 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.981654 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.982653 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:42 crc kubenswrapper[4803]: I0320 17:56:42.992102 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27nxr\" (UniqueName: \"kubernetes.io/projected/149f9011-aff3-4d4a-ac40-d1325e1bdad0-kube-api-access-27nxr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:43 crc kubenswrapper[4803]: I0320 17:56:43.104621 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:56:43 crc kubenswrapper[4803]: I0320 17:56:43.666139 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w"] Mar 20 17:56:44 crc kubenswrapper[4803]: I0320 17:56:44.678008 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" event={"ID":"149f9011-aff3-4d4a-ac40-d1325e1bdad0","Type":"ContainerStarted","Data":"cb45fa394bc5be01107fc3da088cc63034e8ccfcdb1b2ea109384f6ef3983b1d"} Mar 20 17:56:44 crc kubenswrapper[4803]: I0320 17:56:44.678318 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" event={"ID":"149f9011-aff3-4d4a-ac40-d1325e1bdad0","Type":"ContainerStarted","Data":"3b1d6a1ba48a00ed7e675e0f65a2c713faaa64599f95bb4d7c01d77f26c791ba"} Mar 20 17:56:44 crc kubenswrapper[4803]: I0320 17:56:44.696985 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" podStartSLOduration=2.069135437 podStartE2EDuration="2.696963902s" podCreationTimestamp="2026-03-20 17:56:42 +0000 UTC" firstStartedPulling="2026-03-20 17:56:43.672255535 +0000 UTC m=+2413.583847605" lastFinishedPulling="2026-03-20 17:56:44.30008396 +0000 UTC m=+2414.211676070" observedRunningTime="2026-03-20 17:56:44.694621705 +0000 UTC m=+2414.606213805" watchObservedRunningTime="2026-03-20 17:56:44.696963902 +0000 UTC m=+2414.608555982" Mar 20 17:56:47 crc kubenswrapper[4803]: I0320 17:56:47.848100 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:56:47 crc kubenswrapper[4803]: E0320 17:56:47.849388 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:57:02 crc kubenswrapper[4803]: I0320 17:57:02.848397 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:57:02 crc kubenswrapper[4803]: E0320 17:57:02.849439 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:57:13 crc kubenswrapper[4803]: I0320 17:57:13.849236 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:57:13 crc kubenswrapper[4803]: E0320 17:57:13.850406 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:57:27 crc kubenswrapper[4803]: I0320 17:57:27.849776 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:57:27 crc kubenswrapper[4803]: E0320 17:57:27.851245 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:57:40 crc kubenswrapper[4803]: I0320 17:57:40.862282 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:57:40 crc kubenswrapper[4803]: E0320 17:57:40.863333 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:57:55 crc kubenswrapper[4803]: I0320 17:57:55.848797 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:57:55 crc kubenswrapper[4803]: E0320 17:57:55.849960 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.175499 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567158-fwcck"] Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.180047 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-fwcck" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.185301 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.185710 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.186136 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.196440 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-fwcck"] Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.343400 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmpb\" (UniqueName: \"kubernetes.io/projected/914f86f5-acbe-44ab-91fe-64f51ecade8c-kube-api-access-ldmpb\") pod \"auto-csr-approver-29567158-fwcck\" (UID: \"914f86f5-acbe-44ab-91fe-64f51ecade8c\") " pod="openshift-infra/auto-csr-approver-29567158-fwcck" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.444564 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmpb\" (UniqueName: \"kubernetes.io/projected/914f86f5-acbe-44ab-91fe-64f51ecade8c-kube-api-access-ldmpb\") pod \"auto-csr-approver-29567158-fwcck\" (UID: \"914f86f5-acbe-44ab-91fe-64f51ecade8c\") " pod="openshift-infra/auto-csr-approver-29567158-fwcck" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.466253 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmpb\" (UniqueName: \"kubernetes.io/projected/914f86f5-acbe-44ab-91fe-64f51ecade8c-kube-api-access-ldmpb\") pod \"auto-csr-approver-29567158-fwcck\" (UID: \"914f86f5-acbe-44ab-91fe-64f51ecade8c\") " 
pod="openshift-infra/auto-csr-approver-29567158-fwcck" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.523131 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-fwcck" Mar 20 17:58:00 crc kubenswrapper[4803]: I0320 17:58:00.996917 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-fwcck"] Mar 20 17:58:01 crc kubenswrapper[4803]: W0320 17:58:01.013972 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod914f86f5_acbe_44ab_91fe_64f51ecade8c.slice/crio-429bb9683996528f2aaa222e8733a84432dc17a323624d43bbb6ce27be628d72 WatchSource:0}: Error finding container 429bb9683996528f2aaa222e8733a84432dc17a323624d43bbb6ce27be628d72: Status 404 returned error can't find the container with id 429bb9683996528f2aaa222e8733a84432dc17a323624d43bbb6ce27be628d72 Mar 20 17:58:01 crc kubenswrapper[4803]: I0320 17:58:01.018009 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 17:58:01 crc kubenswrapper[4803]: I0320 17:58:01.518152 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-fwcck" event={"ID":"914f86f5-acbe-44ab-91fe-64f51ecade8c","Type":"ContainerStarted","Data":"429bb9683996528f2aaa222e8733a84432dc17a323624d43bbb6ce27be628d72"} Mar 20 17:58:02 crc kubenswrapper[4803]: I0320 17:58:02.526864 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-fwcck" event={"ID":"914f86f5-acbe-44ab-91fe-64f51ecade8c","Type":"ContainerStarted","Data":"7bf6147199fff9e54736cad781cda74747d329b599569d0afcf59b590e982ef1"} Mar 20 17:58:02 crc kubenswrapper[4803]: I0320 17:58:02.553766 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567158-fwcck" 
podStartSLOduration=1.422473579 podStartE2EDuration="2.553746298s" podCreationTimestamp="2026-03-20 17:58:00 +0000 UTC" firstStartedPulling="2026-03-20 17:58:01.017748984 +0000 UTC m=+2490.929341054" lastFinishedPulling="2026-03-20 17:58:02.149021703 +0000 UTC m=+2492.060613773" observedRunningTime="2026-03-20 17:58:02.541457117 +0000 UTC m=+2492.453049187" watchObservedRunningTime="2026-03-20 17:58:02.553746298 +0000 UTC m=+2492.465338368" Mar 20 17:58:03 crc kubenswrapper[4803]: I0320 17:58:03.542993 4803 generic.go:334] "Generic (PLEG): container finished" podID="914f86f5-acbe-44ab-91fe-64f51ecade8c" containerID="7bf6147199fff9e54736cad781cda74747d329b599569d0afcf59b590e982ef1" exitCode=0 Mar 20 17:58:03 crc kubenswrapper[4803]: I0320 17:58:03.543055 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-fwcck" event={"ID":"914f86f5-acbe-44ab-91fe-64f51ecade8c","Type":"ContainerDied","Data":"7bf6147199fff9e54736cad781cda74747d329b599569d0afcf59b590e982ef1"} Mar 20 17:58:04 crc kubenswrapper[4803]: I0320 17:58:04.866507 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-fwcck" Mar 20 17:58:04 crc kubenswrapper[4803]: I0320 17:58:04.940018 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldmpb\" (UniqueName: \"kubernetes.io/projected/914f86f5-acbe-44ab-91fe-64f51ecade8c-kube-api-access-ldmpb\") pod \"914f86f5-acbe-44ab-91fe-64f51ecade8c\" (UID: \"914f86f5-acbe-44ab-91fe-64f51ecade8c\") " Mar 20 17:58:04 crc kubenswrapper[4803]: I0320 17:58:04.947945 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914f86f5-acbe-44ab-91fe-64f51ecade8c-kube-api-access-ldmpb" (OuterVolumeSpecName: "kube-api-access-ldmpb") pod "914f86f5-acbe-44ab-91fe-64f51ecade8c" (UID: "914f86f5-acbe-44ab-91fe-64f51ecade8c"). InnerVolumeSpecName "kube-api-access-ldmpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:58:05 crc kubenswrapper[4803]: I0320 17:58:05.044118 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldmpb\" (UniqueName: \"kubernetes.io/projected/914f86f5-acbe-44ab-91fe-64f51ecade8c-kube-api-access-ldmpb\") on node \"crc\" DevicePath \"\"" Mar 20 17:58:05 crc kubenswrapper[4803]: I0320 17:58:05.561833 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567158-fwcck" event={"ID":"914f86f5-acbe-44ab-91fe-64f51ecade8c","Type":"ContainerDied","Data":"429bb9683996528f2aaa222e8733a84432dc17a323624d43bbb6ce27be628d72"} Mar 20 17:58:05 crc kubenswrapper[4803]: I0320 17:58:05.562160 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="429bb9683996528f2aaa222e8733a84432dc17a323624d43bbb6ce27be628d72" Mar 20 17:58:05 crc kubenswrapper[4803]: I0320 17:58:05.562117 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567158-fwcck" Mar 20 17:58:05 crc kubenswrapper[4803]: I0320 17:58:05.624034 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-nn7vb"] Mar 20 17:58:05 crc kubenswrapper[4803]: I0320 17:58:05.631950 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567152-nn7vb"] Mar 20 17:58:06 crc kubenswrapper[4803]: I0320 17:58:06.861441 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="372304a1-9ac8-40a9-95d3-2e3efbc79546" path="/var/lib/kubelet/pods/372304a1-9ac8-40a9-95d3-2e3efbc79546/volumes" Mar 20 17:58:10 crc kubenswrapper[4803]: I0320 17:58:10.861286 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:58:10 crc kubenswrapper[4803]: E0320 17:58:10.862484 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:58:21 crc kubenswrapper[4803]: I0320 17:58:21.849476 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:58:21 crc kubenswrapper[4803]: E0320 17:58:21.850273 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:58:24 crc kubenswrapper[4803]: I0320 17:58:24.254850 4803 scope.go:117] "RemoveContainer" containerID="c7b87214f927b52ece5ab15b4913f0215be32ca881915c61f5cc4cb7a464d1e9" Mar 20 17:58:32 crc kubenswrapper[4803]: I0320 17:58:32.848306 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:58:32 crc kubenswrapper[4803]: E0320 17:58:32.849772 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:58:44 crc kubenswrapper[4803]: I0320 17:58:44.848141 4803 scope.go:117] "RemoveContainer" 
containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:58:44 crc kubenswrapper[4803]: E0320 17:58:44.848940 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:58:56 crc kubenswrapper[4803]: I0320 17:58:56.848134 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:58:56 crc kubenswrapper[4803]: E0320 17:58:56.848826 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 17:59:10 crc kubenswrapper[4803]: I0320 17:59:10.865195 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 17:59:11 crc kubenswrapper[4803]: I0320 17:59:11.478080 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"b679b259902a7258f1f12079211ce05fbe7129cf0f66c72791a1fe4ab937b8bf"} Mar 20 17:59:11 crc kubenswrapper[4803]: I0320 17:59:11.480664 4803 generic.go:334] "Generic (PLEG): container finished" podID="149f9011-aff3-4d4a-ac40-d1325e1bdad0" 
containerID="cb45fa394bc5be01107fc3da088cc63034e8ccfcdb1b2ea109384f6ef3983b1d" exitCode=0 Mar 20 17:59:11 crc kubenswrapper[4803]: I0320 17:59:11.480731 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" event={"ID":"149f9011-aff3-4d4a-ac40-d1325e1bdad0","Type":"ContainerDied","Data":"cb45fa394bc5be01107fc3da088cc63034e8ccfcdb1b2ea109384f6ef3983b1d"} Mar 20 17:59:12 crc kubenswrapper[4803]: I0320 17:59:12.978933 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.051242 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ssh-key-openstack-edpm-ipam\") pod \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.051643 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-inventory\") pod \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.051667 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-telemetry-combined-ca-bundle\") pod \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.051722 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-2\") pod \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.051756 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27nxr\" (UniqueName: \"kubernetes.io/projected/149f9011-aff3-4d4a-ac40-d1325e1bdad0-kube-api-access-27nxr\") pod \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.051809 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-1\") pod \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.051832 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-0\") pod \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\" (UID: \"149f9011-aff3-4d4a-ac40-d1325e1bdad0\") " Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.057106 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149f9011-aff3-4d4a-ac40-d1325e1bdad0-kube-api-access-27nxr" (OuterVolumeSpecName: "kube-api-access-27nxr") pod "149f9011-aff3-4d4a-ac40-d1325e1bdad0" (UID: "149f9011-aff3-4d4a-ac40-d1325e1bdad0"). InnerVolumeSpecName "kube-api-access-27nxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.070633 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "149f9011-aff3-4d4a-ac40-d1325e1bdad0" (UID: "149f9011-aff3-4d4a-ac40-d1325e1bdad0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.080913 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "149f9011-aff3-4d4a-ac40-d1325e1bdad0" (UID: "149f9011-aff3-4d4a-ac40-d1325e1bdad0"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.081849 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "149f9011-aff3-4d4a-ac40-d1325e1bdad0" (UID: "149f9011-aff3-4d4a-ac40-d1325e1bdad0"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.085396 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "149f9011-aff3-4d4a-ac40-d1325e1bdad0" (UID: "149f9011-aff3-4d4a-ac40-d1325e1bdad0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.093701 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "149f9011-aff3-4d4a-ac40-d1325e1bdad0" (UID: "149f9011-aff3-4d4a-ac40-d1325e1bdad0"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.094898 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-inventory" (OuterVolumeSpecName: "inventory") pod "149f9011-aff3-4d4a-ac40-d1325e1bdad0" (UID: "149f9011-aff3-4d4a-ac40-d1325e1bdad0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.154824 4803 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.154862 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27nxr\" (UniqueName: \"kubernetes.io/projected/149f9011-aff3-4d4a-ac40-d1325e1bdad0-kube-api-access-27nxr\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.154876 4803 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.154889 4803 reconciler_common.go:293] "Volume detached for volume 
\"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.154901 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.154914 4803 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.154927 4803 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/149f9011-aff3-4d4a-ac40-d1325e1bdad0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.508888 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" event={"ID":"149f9011-aff3-4d4a-ac40-d1325e1bdad0","Type":"ContainerDied","Data":"3b1d6a1ba48a00ed7e675e0f65a2c713faaa64599f95bb4d7c01d77f26c791ba"} Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.509410 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b1d6a1ba48a00ed7e675e0f65a2c713faaa64599f95bb4d7c01d77f26c791ba" Mar 20 17:59:13 crc kubenswrapper[4803]: I0320 17:59:13.508979 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.181556 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567160-99nnx"] Mar 20 18:00:00 crc kubenswrapper[4803]: E0320 18:00:00.182777 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="914f86f5-acbe-44ab-91fe-64f51ecade8c" containerName="oc" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.182799 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="914f86f5-acbe-44ab-91fe-64f51ecade8c" containerName="oc" Mar 20 18:00:00 crc kubenswrapper[4803]: E0320 18:00:00.182823 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149f9011-aff3-4d4a-ac40-d1325e1bdad0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.182838 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="149f9011-aff3-4d4a-ac40-d1325e1bdad0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.183232 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="149f9011-aff3-4d4a-ac40-d1325e1bdad0" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.183260 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="914f86f5-acbe-44ab-91fe-64f51ecade8c" containerName="oc" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.184630 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-99nnx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.187868 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.190065 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.190566 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.200595 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx"] Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.202475 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.205550 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.205797 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.212539 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-99nnx"] Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.231716 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx"] Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.340634 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c8e31a41-e21a-43fe-91ef-8307926fb07a-secret-volume\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.340681 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2xtz\" (UniqueName: \"kubernetes.io/projected/c8e31a41-e21a-43fe-91ef-8307926fb07a-kube-api-access-q2xtz\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.340701 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8e31a41-e21a-43fe-91ef-8307926fb07a-config-volume\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.340911 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdvtv\" (UniqueName: \"kubernetes.io/projected/621c23a1-5636-42c7-a060-bdec3307c552-kube-api-access-hdvtv\") pod \"auto-csr-approver-29567160-99nnx\" (UID: \"621c23a1-5636-42c7-a060-bdec3307c552\") " pod="openshift-infra/auto-csr-approver-29567160-99nnx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.442326 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdvtv\" (UniqueName: \"kubernetes.io/projected/621c23a1-5636-42c7-a060-bdec3307c552-kube-api-access-hdvtv\") pod \"auto-csr-approver-29567160-99nnx\" (UID: \"621c23a1-5636-42c7-a060-bdec3307c552\") " pod="openshift-infra/auto-csr-approver-29567160-99nnx" Mar 20 
18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.442450 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8e31a41-e21a-43fe-91ef-8307926fb07a-secret-volume\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.442489 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2xtz\" (UniqueName: \"kubernetes.io/projected/c8e31a41-e21a-43fe-91ef-8307926fb07a-kube-api-access-q2xtz\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.442515 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8e31a41-e21a-43fe-91ef-8307926fb07a-config-volume\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.443667 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8e31a41-e21a-43fe-91ef-8307926fb07a-config-volume\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.450633 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8e31a41-e21a-43fe-91ef-8307926fb07a-secret-volume\") pod \"collect-profiles-29567160-hpbfx\" (UID: 
\"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.466076 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2xtz\" (UniqueName: \"kubernetes.io/projected/c8e31a41-e21a-43fe-91ef-8307926fb07a-kube-api-access-q2xtz\") pod \"collect-profiles-29567160-hpbfx\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.476580 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdvtv\" (UniqueName: \"kubernetes.io/projected/621c23a1-5636-42c7-a060-bdec3307c552-kube-api-access-hdvtv\") pod \"auto-csr-approver-29567160-99nnx\" (UID: \"621c23a1-5636-42c7-a060-bdec3307c552\") " pod="openshift-infra/auto-csr-approver-29567160-99nnx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.556069 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-99nnx" Mar 20 18:00:00 crc kubenswrapper[4803]: I0320 18:00:00.564379 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:01 crc kubenswrapper[4803]: I0320 18:00:01.068280 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-99nnx"] Mar 20 18:00:01 crc kubenswrapper[4803]: I0320 18:00:01.154827 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx"] Mar 20 18:00:01 crc kubenswrapper[4803]: W0320 18:00:01.155296 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8e31a41_e21a_43fe_91ef_8307926fb07a.slice/crio-b40e45aee592cdb7e59aeab27fa88e261a363710aa7a0422d5df1e1a19bfe304 WatchSource:0}: Error finding container b40e45aee592cdb7e59aeab27fa88e261a363710aa7a0422d5df1e1a19bfe304: Status 404 returned error can't find the container with id b40e45aee592cdb7e59aeab27fa88e261a363710aa7a0422d5df1e1a19bfe304 Mar 20 18:00:02 crc kubenswrapper[4803]: I0320 18:00:02.033379 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-99nnx" event={"ID":"621c23a1-5636-42c7-a060-bdec3307c552","Type":"ContainerStarted","Data":"c1a6cef10ce549aee91731fddee5f64f68c4dc86a04d7caf89f086fd1ac7b173"} Mar 20 18:00:02 crc kubenswrapper[4803]: I0320 18:00:02.035282 4803 generic.go:334] "Generic (PLEG): container finished" podID="c8e31a41-e21a-43fe-91ef-8307926fb07a" containerID="696cdd0d3c17809541281913a803588202c78159107ca5ee15bf59839f311e33" exitCode=0 Mar 20 18:00:02 crc kubenswrapper[4803]: I0320 18:00:02.035321 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" event={"ID":"c8e31a41-e21a-43fe-91ef-8307926fb07a","Type":"ContainerDied","Data":"696cdd0d3c17809541281913a803588202c78159107ca5ee15bf59839f311e33"} Mar 20 18:00:02 crc kubenswrapper[4803]: I0320 18:00:02.035343 4803 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" event={"ID":"c8e31a41-e21a-43fe-91ef-8307926fb07a","Type":"ContainerStarted","Data":"b40e45aee592cdb7e59aeab27fa88e261a363710aa7a0422d5df1e1a19bfe304"} Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.436727 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.500376 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2xtz\" (UniqueName: \"kubernetes.io/projected/c8e31a41-e21a-43fe-91ef-8307926fb07a-kube-api-access-q2xtz\") pod \"c8e31a41-e21a-43fe-91ef-8307926fb07a\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.500547 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8e31a41-e21a-43fe-91ef-8307926fb07a-secret-volume\") pod \"c8e31a41-e21a-43fe-91ef-8307926fb07a\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.500695 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8e31a41-e21a-43fe-91ef-8307926fb07a-config-volume\") pod \"c8e31a41-e21a-43fe-91ef-8307926fb07a\" (UID: \"c8e31a41-e21a-43fe-91ef-8307926fb07a\") " Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.502557 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8e31a41-e21a-43fe-91ef-8307926fb07a-config-volume" (OuterVolumeSpecName: "config-volume") pod "c8e31a41-e21a-43fe-91ef-8307926fb07a" (UID: "c8e31a41-e21a-43fe-91ef-8307926fb07a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.511803 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e31a41-e21a-43fe-91ef-8307926fb07a-kube-api-access-q2xtz" (OuterVolumeSpecName: "kube-api-access-q2xtz") pod "c8e31a41-e21a-43fe-91ef-8307926fb07a" (UID: "c8e31a41-e21a-43fe-91ef-8307926fb07a"). InnerVolumeSpecName "kube-api-access-q2xtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.523506 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e31a41-e21a-43fe-91ef-8307926fb07a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c8e31a41-e21a-43fe-91ef-8307926fb07a" (UID: "c8e31a41-e21a-43fe-91ef-8307926fb07a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.602814 4803 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c8e31a41-e21a-43fe-91ef-8307926fb07a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.603135 4803 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8e31a41-e21a-43fe-91ef-8307926fb07a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:03 crc kubenswrapper[4803]: I0320 18:00:03.603148 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2xtz\" (UniqueName: \"kubernetes.io/projected/c8e31a41-e21a-43fe-91ef-8307926fb07a-kube-api-access-q2xtz\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:04 crc kubenswrapper[4803]: I0320 18:00:04.057170 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" 
event={"ID":"c8e31a41-e21a-43fe-91ef-8307926fb07a","Type":"ContainerDied","Data":"b40e45aee592cdb7e59aeab27fa88e261a363710aa7a0422d5df1e1a19bfe304"} Mar 20 18:00:04 crc kubenswrapper[4803]: I0320 18:00:04.057212 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b40e45aee592cdb7e59aeab27fa88e261a363710aa7a0422d5df1e1a19bfe304" Mar 20 18:00:04 crc kubenswrapper[4803]: I0320 18:00:04.057283 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567160-hpbfx" Mar 20 18:00:04 crc kubenswrapper[4803]: I0320 18:00:04.528559 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"] Mar 20 18:00:04 crc kubenswrapper[4803]: I0320 18:00:04.545358 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567115-f7sgg"] Mar 20 18:00:04 crc kubenswrapper[4803]: I0320 18:00:04.865057 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a" path="/var/lib/kubelet/pods/ed3d7dfe-46d3-47e0-bae2-0fe4dfed694a/volumes" Mar 20 18:00:06 crc kubenswrapper[4803]: I0320 18:00:06.078169 4803 generic.go:334] "Generic (PLEG): container finished" podID="621c23a1-5636-42c7-a060-bdec3307c552" containerID="3f8d588ef9b7f44fcd0eed192a1ff5614c3c25aae725a16509d9e8f8b962effd" exitCode=0 Mar 20 18:00:06 crc kubenswrapper[4803]: I0320 18:00:06.078277 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-99nnx" event={"ID":"621c23a1-5636-42c7-a060-bdec3307c552","Type":"ContainerDied","Data":"3f8d588ef9b7f44fcd0eed192a1ff5614c3c25aae725a16509d9e8f8b962effd"} Mar 20 18:00:07 crc kubenswrapper[4803]: I0320 18:00:07.461332 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-99nnx" Mar 20 18:00:07 crc kubenswrapper[4803]: I0320 18:00:07.585203 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdvtv\" (UniqueName: \"kubernetes.io/projected/621c23a1-5636-42c7-a060-bdec3307c552-kube-api-access-hdvtv\") pod \"621c23a1-5636-42c7-a060-bdec3307c552\" (UID: \"621c23a1-5636-42c7-a060-bdec3307c552\") " Mar 20 18:00:07 crc kubenswrapper[4803]: I0320 18:00:07.590210 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/621c23a1-5636-42c7-a060-bdec3307c552-kube-api-access-hdvtv" (OuterVolumeSpecName: "kube-api-access-hdvtv") pod "621c23a1-5636-42c7-a060-bdec3307c552" (UID: "621c23a1-5636-42c7-a060-bdec3307c552"). InnerVolumeSpecName "kube-api-access-hdvtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:00:07 crc kubenswrapper[4803]: I0320 18:00:07.687732 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdvtv\" (UniqueName: \"kubernetes.io/projected/621c23a1-5636-42c7-a060-bdec3307c552-kube-api-access-hdvtv\") on node \"crc\" DevicePath \"\"" Mar 20 18:00:08 crc kubenswrapper[4803]: I0320 18:00:08.101658 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567160-99nnx" event={"ID":"621c23a1-5636-42c7-a060-bdec3307c552","Type":"ContainerDied","Data":"c1a6cef10ce549aee91731fddee5f64f68c4dc86a04d7caf89f086fd1ac7b173"} Mar 20 18:00:08 crc kubenswrapper[4803]: I0320 18:00:08.102055 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1a6cef10ce549aee91731fddee5f64f68c4dc86a04d7caf89f086fd1ac7b173" Mar 20 18:00:08 crc kubenswrapper[4803]: I0320 18:00:08.101742 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567160-99nnx" Mar 20 18:00:08 crc kubenswrapper[4803]: I0320 18:00:08.523269 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-b5d9q"] Mar 20 18:00:08 crc kubenswrapper[4803]: I0320 18:00:08.537979 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567154-b5d9q"] Mar 20 18:00:08 crc kubenswrapper[4803]: I0320 18:00:08.857053 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b57f205-75ea-4ba3-899c-611ed863cff7" path="/var/lib/kubelet/pods/3b57f205-75ea-4ba3-899c-611ed863cff7/volumes" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.129991 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:00:14 crc kubenswrapper[4803]: E0320 18:00:14.131063 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="621c23a1-5636-42c7-a060-bdec3307c552" containerName="oc" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.131081 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="621c23a1-5636-42c7-a060-bdec3307c552" containerName="oc" Mar 20 18:00:14 crc kubenswrapper[4803]: E0320 18:00:14.131104 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e31a41-e21a-43fe-91ef-8307926fb07a" containerName="collect-profiles" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.131111 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e31a41-e21a-43fe-91ef-8307926fb07a" containerName="collect-profiles" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.131352 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e31a41-e21a-43fe-91ef-8307926fb07a" containerName="collect-profiles" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.131384 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="621c23a1-5636-42c7-a060-bdec3307c552" containerName="oc" Mar 
20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.132666 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.135387 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.135413 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.135681 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.137132 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m64r5" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.143570 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.214624 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215011 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215039 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215058 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215082 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215114 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215140 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-config-data\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215450 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4gs\" 
(UniqueName: \"kubernetes.io/projected/6ba5f719-6967-43a1-b544-c27baf20c15b-kube-api-access-fz4gs\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.215611 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317182 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317245 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317280 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317325 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317369 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317399 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-config-data\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317451 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4gs\" (UniqueName: \"kubernetes.io/projected/6ba5f719-6967-43a1-b544-c27baf20c15b-kube-api-access-fz4gs\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317484 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317602 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.317972 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.318079 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.318721 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.319362 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-config-data\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.319501 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config\") pod \"tempest-tests-tempest\" (UID: 
\"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.325295 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.328513 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.331399 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.336667 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4gs\" (UniqueName: \"kubernetes.io/projected/6ba5f719-6967-43a1-b544-c27baf20c15b-kube-api-access-fz4gs\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.360942 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.456478 4803 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:00:14 crc kubenswrapper[4803]: I0320 18:00:14.923719 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 18:00:14 crc kubenswrapper[4803]: W0320 18:00:14.935067 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba5f719_6967_43a1_b544_c27baf20c15b.slice/crio-94d746cf621473edf2bd87e8f719caf5539f1373fd13d46f10249bc600246e64 WatchSource:0}: Error finding container 94d746cf621473edf2bd87e8f719caf5539f1373fd13d46f10249bc600246e64: Status 404 returned error can't find the container with id 94d746cf621473edf2bd87e8f719caf5539f1373fd13d46f10249bc600246e64 Mar 20 18:00:15 crc kubenswrapper[4803]: I0320 18:00:15.172872 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6ba5f719-6967-43a1-b544-c27baf20c15b","Type":"ContainerStarted","Data":"94d746cf621473edf2bd87e8f719caf5539f1373fd13d46f10249bc600246e64"} Mar 20 18:00:24 crc kubenswrapper[4803]: I0320 18:00:24.423196 4803 scope.go:117] "RemoveContainer" containerID="acc2f987afb39df4999db0d2666d8d9e52c97902961a5c4f37dedb9b0bdb043e" Mar 20 18:00:39 crc kubenswrapper[4803]: I0320 18:00:39.845323 4803 scope.go:117] "RemoveContainer" containerID="0959dc77a9f2d2aa346835d08a9ec8d7137fe0443d04b6706d0d3fe595e5af9d" Mar 20 18:00:39 crc kubenswrapper[4803]: E0320 18:00:39.934313 4803 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 20 18:00:39 crc kubenswrapper[4803]: E0320 18:00:39.934468 4803 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fz4gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(6ba5f719-6967-43a1-b544-c27baf20c15b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 18:00:39 crc kubenswrapper[4803]: E0320 18:00:39.935850 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="6ba5f719-6967-43a1-b544-c27baf20c15b" Mar 20 18:00:40 crc kubenswrapper[4803]: E0320 18:00:40.414916 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="6ba5f719-6967-43a1-b544-c27baf20c15b" Mar 20 18:00:55 crc 
kubenswrapper[4803]: I0320 18:00:55.469848 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 18:00:57 crc kubenswrapper[4803]: I0320 18:00:57.599861 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6ba5f719-6967-43a1-b544-c27baf20c15b","Type":"ContainerStarted","Data":"f0662d5128585c2f717e7884b477579f75ef9bd1efa8d3800000e72b095ea12e"} Mar 20 18:00:57 crc kubenswrapper[4803]: I0320 18:00:57.625417 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.097353383 podStartE2EDuration="44.625395661s" podCreationTimestamp="2026-03-20 18:00:13 +0000 UTC" firstStartedPulling="2026-03-20 18:00:14.938742845 +0000 UTC m=+2624.850334915" lastFinishedPulling="2026-03-20 18:00:55.466785123 +0000 UTC m=+2665.378377193" observedRunningTime="2026-03-20 18:00:57.622874449 +0000 UTC m=+2667.534466549" watchObservedRunningTime="2026-03-20 18:00:57.625395661 +0000 UTC m=+2667.536987741" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.208045 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29567161-685pt"] Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.209578 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.223355 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-685pt"] Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.394542 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-fernet-keys\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.394624 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-combined-ca-bundle\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.395046 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbgq8\" (UniqueName: \"kubernetes.io/projected/5b706f4a-4386-4911-a201-5cbf5e7bd916-kube-api-access-sbgq8\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.395169 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-config-data\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.496793 4803 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-sbgq8\" (UniqueName: \"kubernetes.io/projected/5b706f4a-4386-4911-a201-5cbf5e7bd916-kube-api-access-sbgq8\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.497115 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-config-data\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.497198 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-fernet-keys\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.497275 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-combined-ca-bundle\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.505422 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-combined-ca-bundle\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.517475 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-fernet-keys\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.518334 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-config-data\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.524856 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbgq8\" (UniqueName: \"kubernetes.io/projected/5b706f4a-4386-4911-a201-5cbf5e7bd916-kube-api-access-sbgq8\") pod \"keystone-cron-29567161-685pt\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:00 crc kubenswrapper[4803]: I0320 18:01:00.544497 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:01 crc kubenswrapper[4803]: I0320 18:01:01.023569 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29567161-685pt"] Mar 20 18:01:01 crc kubenswrapper[4803]: I0320 18:01:01.647298 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-685pt" event={"ID":"5b706f4a-4386-4911-a201-5cbf5e7bd916","Type":"ContainerStarted","Data":"89f2b123f1fe4bc0ef162164d9dff8a30aa5e0a252693ab9a7347816f79e2e0b"} Mar 20 18:01:01 crc kubenswrapper[4803]: I0320 18:01:01.647671 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-685pt" event={"ID":"5b706f4a-4386-4911-a201-5cbf5e7bd916","Type":"ContainerStarted","Data":"c51c831b0cd89a74733f88d772246bf39ea732a735318c9fbfc7333c4bb8efec"} Mar 20 18:01:03 crc kubenswrapper[4803]: I0320 18:01:03.665740 4803 generic.go:334] "Generic (PLEG): container finished" podID="5b706f4a-4386-4911-a201-5cbf5e7bd916" containerID="89f2b123f1fe4bc0ef162164d9dff8a30aa5e0a252693ab9a7347816f79e2e0b" exitCode=0 Mar 20 18:01:03 crc kubenswrapper[4803]: I0320 18:01:03.665807 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-685pt" event={"ID":"5b706f4a-4386-4911-a201-5cbf5e7bd916","Type":"ContainerDied","Data":"89f2b123f1fe4bc0ef162164d9dff8a30aa5e0a252693ab9a7347816f79e2e0b"} Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.051343 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.192881 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-combined-ca-bundle\") pod \"5b706f4a-4386-4911-a201-5cbf5e7bd916\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.193108 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-fernet-keys\") pod \"5b706f4a-4386-4911-a201-5cbf5e7bd916\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.193169 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbgq8\" (UniqueName: \"kubernetes.io/projected/5b706f4a-4386-4911-a201-5cbf5e7bd916-kube-api-access-sbgq8\") pod \"5b706f4a-4386-4911-a201-5cbf5e7bd916\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.193269 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-config-data\") pod \"5b706f4a-4386-4911-a201-5cbf5e7bd916\" (UID: \"5b706f4a-4386-4911-a201-5cbf5e7bd916\") " Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.199185 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5b706f4a-4386-4911-a201-5cbf5e7bd916" (UID: "5b706f4a-4386-4911-a201-5cbf5e7bd916"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.206769 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b706f4a-4386-4911-a201-5cbf5e7bd916-kube-api-access-sbgq8" (OuterVolumeSpecName: "kube-api-access-sbgq8") pod "5b706f4a-4386-4911-a201-5cbf5e7bd916" (UID: "5b706f4a-4386-4911-a201-5cbf5e7bd916"). InnerVolumeSpecName "kube-api-access-sbgq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.222137 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b706f4a-4386-4911-a201-5cbf5e7bd916" (UID: "5b706f4a-4386-4911-a201-5cbf5e7bd916"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.260164 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-config-data" (OuterVolumeSpecName: "config-data") pod "5b706f4a-4386-4911-a201-5cbf5e7bd916" (UID: "5b706f4a-4386-4911-a201-5cbf5e7bd916"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.296104 4803 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.296142 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbgq8\" (UniqueName: \"kubernetes.io/projected/5b706f4a-4386-4911-a201-5cbf5e7bd916-kube-api-access-sbgq8\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.296159 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.296173 4803 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b706f4a-4386-4911-a201-5cbf5e7bd916-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.687027 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29567161-685pt" event={"ID":"5b706f4a-4386-4911-a201-5cbf5e7bd916","Type":"ContainerDied","Data":"c51c831b0cd89a74733f88d772246bf39ea732a735318c9fbfc7333c4bb8efec"} Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.687069 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c51c831b0cd89a74733f88d772246bf39ea732a735318c9fbfc7333c4bb8efec" Mar 20 18:01:05 crc kubenswrapper[4803]: I0320 18:01:05.687119 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29567161-685pt" Mar 20 18:01:38 crc kubenswrapper[4803]: I0320 18:01:38.246361 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:01:38 crc kubenswrapper[4803]: I0320 18:01:38.246923 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.156269 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567162-nrcjv"] Mar 20 18:02:00 crc kubenswrapper[4803]: E0320 18:02:00.157431 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b706f4a-4386-4911-a201-5cbf5e7bd916" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.157451 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b706f4a-4386-4911-a201-5cbf5e7bd916" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.157788 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b706f4a-4386-4911-a201-5cbf5e7bd916" containerName="keystone-cron" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.158655 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-nrcjv" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.161694 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.161909 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.162070 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.166549 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-nrcjv"] Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.270840 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5thml\" (UniqueName: \"kubernetes.io/projected/c2cd5358-529a-401b-ae71-b30f67d7e9c0-kube-api-access-5thml\") pod \"auto-csr-approver-29567162-nrcjv\" (UID: \"c2cd5358-529a-401b-ae71-b30f67d7e9c0\") " pod="openshift-infra/auto-csr-approver-29567162-nrcjv" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.373156 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5thml\" (UniqueName: \"kubernetes.io/projected/c2cd5358-529a-401b-ae71-b30f67d7e9c0-kube-api-access-5thml\") pod \"auto-csr-approver-29567162-nrcjv\" (UID: \"c2cd5358-529a-401b-ae71-b30f67d7e9c0\") " pod="openshift-infra/auto-csr-approver-29567162-nrcjv" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.397959 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5thml\" (UniqueName: \"kubernetes.io/projected/c2cd5358-529a-401b-ae71-b30f67d7e9c0-kube-api-access-5thml\") pod \"auto-csr-approver-29567162-nrcjv\" (UID: \"c2cd5358-529a-401b-ae71-b30f67d7e9c0\") " 
pod="openshift-infra/auto-csr-approver-29567162-nrcjv" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.486207 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-nrcjv" Mar 20 18:02:00 crc kubenswrapper[4803]: I0320 18:02:00.956865 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-nrcjv"] Mar 20 18:02:01 crc kubenswrapper[4803]: I0320 18:02:01.333892 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-nrcjv" event={"ID":"c2cd5358-529a-401b-ae71-b30f67d7e9c0","Type":"ContainerStarted","Data":"99edd7d1520c32aaafe4afd27c5f75d1e76a2bc7ffd0853d1fa581cf8d456d51"} Mar 20 18:02:03 crc kubenswrapper[4803]: I0320 18:02:03.359466 4803 generic.go:334] "Generic (PLEG): container finished" podID="c2cd5358-529a-401b-ae71-b30f67d7e9c0" containerID="70ebf02357aa4015af3f7526f6875647c07cacb1cd69621c18d337acfa4b8577" exitCode=0 Mar 20 18:02:03 crc kubenswrapper[4803]: I0320 18:02:03.359640 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-nrcjv" event={"ID":"c2cd5358-529a-401b-ae71-b30f67d7e9c0","Type":"ContainerDied","Data":"70ebf02357aa4015af3f7526f6875647c07cacb1cd69621c18d337acfa4b8577"} Mar 20 18:02:04 crc kubenswrapper[4803]: I0320 18:02:04.841737 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-nrcjv" Mar 20 18:02:04 crc kubenswrapper[4803]: I0320 18:02:04.994054 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5thml\" (UniqueName: \"kubernetes.io/projected/c2cd5358-529a-401b-ae71-b30f67d7e9c0-kube-api-access-5thml\") pod \"c2cd5358-529a-401b-ae71-b30f67d7e9c0\" (UID: \"c2cd5358-529a-401b-ae71-b30f67d7e9c0\") " Mar 20 18:02:05 crc kubenswrapper[4803]: I0320 18:02:05.000907 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cd5358-529a-401b-ae71-b30f67d7e9c0-kube-api-access-5thml" (OuterVolumeSpecName: "kube-api-access-5thml") pod "c2cd5358-529a-401b-ae71-b30f67d7e9c0" (UID: "c2cd5358-529a-401b-ae71-b30f67d7e9c0"). InnerVolumeSpecName "kube-api-access-5thml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:02:05 crc kubenswrapper[4803]: I0320 18:02:05.097653 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5thml\" (UniqueName: \"kubernetes.io/projected/c2cd5358-529a-401b-ae71-b30f67d7e9c0-kube-api-access-5thml\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:05 crc kubenswrapper[4803]: I0320 18:02:05.380859 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567162-nrcjv" event={"ID":"c2cd5358-529a-401b-ae71-b30f67d7e9c0","Type":"ContainerDied","Data":"99edd7d1520c32aaafe4afd27c5f75d1e76a2bc7ffd0853d1fa581cf8d456d51"} Mar 20 18:02:05 crc kubenswrapper[4803]: I0320 18:02:05.380914 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99edd7d1520c32aaafe4afd27c5f75d1e76a2bc7ffd0853d1fa581cf8d456d51" Mar 20 18:02:05 crc kubenswrapper[4803]: I0320 18:02:05.380934 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567162-nrcjv" Mar 20 18:02:05 crc kubenswrapper[4803]: I0320 18:02:05.955023 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-zdnsv"] Mar 20 18:02:05 crc kubenswrapper[4803]: I0320 18:02:05.971606 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567156-zdnsv"] Mar 20 18:02:06 crc kubenswrapper[4803]: I0320 18:02:06.864408 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2646be0c-2337-420b-8443-eead2fe04663" path="/var/lib/kubelet/pods/2646be0c-2337-420b-8443-eead2fe04663/volumes" Mar 20 18:02:08 crc kubenswrapper[4803]: I0320 18:02:08.246082 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:02:08 crc kubenswrapper[4803]: I0320 18:02:08.246481 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.246462 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.247043 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.247099 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.247908 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b679b259902a7258f1f12079211ce05fbe7129cf0f66c72791a1fe4ab937b8bf"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.247976 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://b679b259902a7258f1f12079211ce05fbe7129cf0f66c72791a1fe4ab937b8bf" gracePeriod=600 Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.697907 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="b679b259902a7258f1f12079211ce05fbe7129cf0f66c72791a1fe4ab937b8bf" exitCode=0 Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.698029 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"b679b259902a7258f1f12079211ce05fbe7129cf0f66c72791a1fe4ab937b8bf"} Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.698380 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"} Mar 20 18:02:38 crc kubenswrapper[4803]: I0320 18:02:38.698424 4803 scope.go:117] "RemoveContainer" containerID="916ae54c8b8f3d3dbe4c9ae5fb9ed153888db1910b331a16db2f512763e4f50d" Mar 20 18:02:40 crc kubenswrapper[4803]: I0320 18:02:40.006167 4803 scope.go:117] "RemoveContainer" containerID="f2532284a06b6a23214030e3cdd8b9140783bb8ac7ecc77e91b2bb77c833c5f1" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.767326 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c4p6v"] Mar 20 18:02:41 crc kubenswrapper[4803]: E0320 18:02:41.768411 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cd5358-529a-401b-ae71-b30f67d7e9c0" containerName="oc" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.768427 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cd5358-529a-401b-ae71-b30f67d7e9c0" containerName="oc" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.768682 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cd5358-529a-401b-ae71-b30f67d7e9c0" containerName="oc" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.770560 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.778985 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4p6v"] Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.865389 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-utilities\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.865446 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-catalog-content\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.865553 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/91c783e6-814e-4bb7-8835-178e4fae327f-kube-api-access-kv8x2\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.966910 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-utilities\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.966990 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-catalog-content\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.967161 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/91c783e6-814e-4bb7-8835-178e4fae327f-kube-api-access-kv8x2\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.967468 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-utilities\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.968236 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-catalog-content\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:41 crc kubenswrapper[4803]: I0320 18:02:41.987994 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/91c783e6-814e-4bb7-8835-178e4fae327f-kube-api-access-kv8x2\") pod \"community-operators-c4p6v\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:42 crc kubenswrapper[4803]: I0320 18:02:42.093896 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:42 crc kubenswrapper[4803]: I0320 18:02:42.586202 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4p6v"] Mar 20 18:02:42 crc kubenswrapper[4803]: I0320 18:02:42.745307 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4p6v" event={"ID":"91c783e6-814e-4bb7-8835-178e4fae327f","Type":"ContainerStarted","Data":"78454380e640a12283492277bd8ea8644722ff073ee9fe07a01336c37a385ae2"} Mar 20 18:02:43 crc kubenswrapper[4803]: I0320 18:02:43.754232 4803 generic.go:334] "Generic (PLEG): container finished" podID="91c783e6-814e-4bb7-8835-178e4fae327f" containerID="c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c" exitCode=0 Mar 20 18:02:43 crc kubenswrapper[4803]: I0320 18:02:43.754425 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4p6v" event={"ID":"91c783e6-814e-4bb7-8835-178e4fae327f","Type":"ContainerDied","Data":"c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c"} Mar 20 18:02:44 crc kubenswrapper[4803]: I0320 18:02:44.765955 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4p6v" event={"ID":"91c783e6-814e-4bb7-8835-178e4fae327f","Type":"ContainerStarted","Data":"ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce"} Mar 20 18:02:45 crc kubenswrapper[4803]: I0320 18:02:45.775572 4803 generic.go:334] "Generic (PLEG): container finished" podID="91c783e6-814e-4bb7-8835-178e4fae327f" containerID="ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce" exitCode=0 Mar 20 18:02:45 crc kubenswrapper[4803]: I0320 18:02:45.775661 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4p6v" 
event={"ID":"91c783e6-814e-4bb7-8835-178e4fae327f","Type":"ContainerDied","Data":"ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce"} Mar 20 18:02:46 crc kubenswrapper[4803]: I0320 18:02:46.787462 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4p6v" event={"ID":"91c783e6-814e-4bb7-8835-178e4fae327f","Type":"ContainerStarted","Data":"375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca"} Mar 20 18:02:46 crc kubenswrapper[4803]: I0320 18:02:46.810237 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c4p6v" podStartSLOduration=3.285657266 podStartE2EDuration="5.810220566s" podCreationTimestamp="2026-03-20 18:02:41 +0000 UTC" firstStartedPulling="2026-03-20 18:02:43.756106113 +0000 UTC m=+2773.667698183" lastFinishedPulling="2026-03-20 18:02:46.280669413 +0000 UTC m=+2776.192261483" observedRunningTime="2026-03-20 18:02:46.802906658 +0000 UTC m=+2776.714498758" watchObservedRunningTime="2026-03-20 18:02:46.810220566 +0000 UTC m=+2776.721812636" Mar 20 18:02:52 crc kubenswrapper[4803]: I0320 18:02:52.094604 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:52 crc kubenswrapper[4803]: I0320 18:02:52.097083 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:52 crc kubenswrapper[4803]: I0320 18:02:52.169005 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:52 crc kubenswrapper[4803]: I0320 18:02:52.913415 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:52 crc kubenswrapper[4803]: I0320 18:02:52.971050 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-c4p6v"] Mar 20 18:02:54 crc kubenswrapper[4803]: I0320 18:02:54.863937 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c4p6v" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="registry-server" containerID="cri-o://375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca" gracePeriod=2 Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.427203 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.544698 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-catalog-content\") pod \"91c783e6-814e-4bb7-8835-178e4fae327f\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.544816 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-utilities\") pod \"91c783e6-814e-4bb7-8835-178e4fae327f\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.544879 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/91c783e6-814e-4bb7-8835-178e4fae327f-kube-api-access-kv8x2\") pod \"91c783e6-814e-4bb7-8835-178e4fae327f\" (UID: \"91c783e6-814e-4bb7-8835-178e4fae327f\") " Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.546238 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-utilities" (OuterVolumeSpecName: "utilities") pod "91c783e6-814e-4bb7-8835-178e4fae327f" (UID: 
"91c783e6-814e-4bb7-8835-178e4fae327f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.552718 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91c783e6-814e-4bb7-8835-178e4fae327f-kube-api-access-kv8x2" (OuterVolumeSpecName: "kube-api-access-kv8x2") pod "91c783e6-814e-4bb7-8835-178e4fae327f" (UID: "91c783e6-814e-4bb7-8835-178e4fae327f"). InnerVolumeSpecName "kube-api-access-kv8x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.594751 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91c783e6-814e-4bb7-8835-178e4fae327f" (UID: "91c783e6-814e-4bb7-8835-178e4fae327f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.647838 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.647905 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kv8x2\" (UniqueName: \"kubernetes.io/projected/91c783e6-814e-4bb7-8835-178e4fae327f-kube-api-access-kv8x2\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.647934 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91c783e6-814e-4bb7-8835-178e4fae327f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.878448 4803 generic.go:334] "Generic (PLEG): container finished" 
podID="91c783e6-814e-4bb7-8835-178e4fae327f" containerID="375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca" exitCode=0 Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.878518 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4p6v" event={"ID":"91c783e6-814e-4bb7-8835-178e4fae327f","Type":"ContainerDied","Data":"375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca"} Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.878600 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4p6v" event={"ID":"91c783e6-814e-4bb7-8835-178e4fae327f","Type":"ContainerDied","Data":"78454380e640a12283492277bd8ea8644722ff073ee9fe07a01336c37a385ae2"} Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.878672 4803 scope.go:117] "RemoveContainer" containerID="375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.878887 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4p6v" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.924938 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c4p6v"] Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.929186 4803 scope.go:117] "RemoveContainer" containerID="ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce" Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.936273 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c4p6v"] Mar 20 18:02:55 crc kubenswrapper[4803]: I0320 18:02:55.964735 4803 scope.go:117] "RemoveContainer" containerID="c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c" Mar 20 18:02:56 crc kubenswrapper[4803]: I0320 18:02:56.048791 4803 scope.go:117] "RemoveContainer" containerID="375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca" Mar 20 18:02:56 crc kubenswrapper[4803]: E0320 18:02:56.049423 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca\": container with ID starting with 375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca not found: ID does not exist" containerID="375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca" Mar 20 18:02:56 crc kubenswrapper[4803]: I0320 18:02:56.049474 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca"} err="failed to get container status \"375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca\": rpc error: code = NotFound desc = could not find container \"375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca\": container with ID starting with 375866bce5a243e714b7f63d4eba8e607035f410888f37243aae003d0116b5ca not 
found: ID does not exist" Mar 20 18:02:56 crc kubenswrapper[4803]: I0320 18:02:56.049511 4803 scope.go:117] "RemoveContainer" containerID="ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce" Mar 20 18:02:56 crc kubenswrapper[4803]: E0320 18:02:56.050042 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce\": container with ID starting with ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce not found: ID does not exist" containerID="ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce" Mar 20 18:02:56 crc kubenswrapper[4803]: I0320 18:02:56.050084 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce"} err="failed to get container status \"ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce\": rpc error: code = NotFound desc = could not find container \"ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce\": container with ID starting with ac7fba188b3083a124ebe3a1de31b66c8dda92cb2559520caadb76e1d695dbce not found: ID does not exist" Mar 20 18:02:56 crc kubenswrapper[4803]: I0320 18:02:56.050113 4803 scope.go:117] "RemoveContainer" containerID="c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c" Mar 20 18:02:56 crc kubenswrapper[4803]: E0320 18:02:56.050439 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c\": container with ID starting with c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c not found: ID does not exist" containerID="c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c" Mar 20 18:02:56 crc kubenswrapper[4803]: I0320 18:02:56.050487 4803 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c"} err="failed to get container status \"c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c\": rpc error: code = NotFound desc = could not find container \"c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c\": container with ID starting with c6b3b76a644fc79618d88de1ed385473148d11339462866014d2f22ce2e2130c not found: ID does not exist" Mar 20 18:02:56 crc kubenswrapper[4803]: I0320 18:02:56.869989 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" path="/var/lib/kubelet/pods/91c783e6-814e-4bb7-8835-178e4fae327f/volumes" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.155203 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567164-v6nb5"] Mar 20 18:04:00 crc kubenswrapper[4803]: E0320 18:04:00.156153 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="extract-content" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.156167 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="extract-content" Mar 20 18:04:00 crc kubenswrapper[4803]: E0320 18:04:00.156181 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="extract-utilities" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.156189 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="extract-utilities" Mar 20 18:04:00 crc kubenswrapper[4803]: E0320 18:04:00.156225 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="registry-server" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 
18:04:00.156233 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="registry-server" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.156425 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="91c783e6-814e-4bb7-8835-178e4fae327f" containerName="registry-server" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.157067 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-v6nb5" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.160831 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.161118 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.161745 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.169352 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-v6nb5"] Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.337642 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m25jq\" (UniqueName: \"kubernetes.io/projected/156d367b-5c12-4745-9cef-7605b33c7a71-kube-api-access-m25jq\") pod \"auto-csr-approver-29567164-v6nb5\" (UID: \"156d367b-5c12-4745-9cef-7605b33c7a71\") " pod="openshift-infra/auto-csr-approver-29567164-v6nb5" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.439044 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m25jq\" (UniqueName: \"kubernetes.io/projected/156d367b-5c12-4745-9cef-7605b33c7a71-kube-api-access-m25jq\") pod \"auto-csr-approver-29567164-v6nb5\" (UID: 
\"156d367b-5c12-4745-9cef-7605b33c7a71\") " pod="openshift-infra/auto-csr-approver-29567164-v6nb5" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.466397 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m25jq\" (UniqueName: \"kubernetes.io/projected/156d367b-5c12-4745-9cef-7605b33c7a71-kube-api-access-m25jq\") pod \"auto-csr-approver-29567164-v6nb5\" (UID: \"156d367b-5c12-4745-9cef-7605b33c7a71\") " pod="openshift-infra/auto-csr-approver-29567164-v6nb5" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.480041 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-v6nb5" Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.970123 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-v6nb5"] Mar 20 18:04:00 crc kubenswrapper[4803]: I0320 18:04:00.977729 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:04:01 crc kubenswrapper[4803]: I0320 18:04:01.544145 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-v6nb5" event={"ID":"156d367b-5c12-4745-9cef-7605b33c7a71","Type":"ContainerStarted","Data":"0b0c6870f3f35a798b3cc89020c55a229890ca6b70e9954641e315da7a3d37f2"} Mar 20 18:04:02 crc kubenswrapper[4803]: I0320 18:04:02.553143 4803 generic.go:334] "Generic (PLEG): container finished" podID="156d367b-5c12-4745-9cef-7605b33c7a71" containerID="6d5f00c2965e823b65ee19aa5013e77526b90fa5e09f43e86a1c394368dc9040" exitCode=0 Mar 20 18:04:02 crc kubenswrapper[4803]: I0320 18:04:02.553690 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-v6nb5" event={"ID":"156d367b-5c12-4745-9cef-7605b33c7a71","Type":"ContainerDied","Data":"6d5f00c2965e823b65ee19aa5013e77526b90fa5e09f43e86a1c394368dc9040"} Mar 20 18:04:03 crc kubenswrapper[4803]: I0320 
18:04:03.944609 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-v6nb5" Mar 20 18:04:04 crc kubenswrapper[4803]: I0320 18:04:04.108341 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m25jq\" (UniqueName: \"kubernetes.io/projected/156d367b-5c12-4745-9cef-7605b33c7a71-kube-api-access-m25jq\") pod \"156d367b-5c12-4745-9cef-7605b33c7a71\" (UID: \"156d367b-5c12-4745-9cef-7605b33c7a71\") " Mar 20 18:04:04 crc kubenswrapper[4803]: I0320 18:04:04.126866 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156d367b-5c12-4745-9cef-7605b33c7a71-kube-api-access-m25jq" (OuterVolumeSpecName: "kube-api-access-m25jq") pod "156d367b-5c12-4745-9cef-7605b33c7a71" (UID: "156d367b-5c12-4745-9cef-7605b33c7a71"). InnerVolumeSpecName "kube-api-access-m25jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:04:04 crc kubenswrapper[4803]: I0320 18:04:04.210630 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m25jq\" (UniqueName: \"kubernetes.io/projected/156d367b-5c12-4745-9cef-7605b33c7a71-kube-api-access-m25jq\") on node \"crc\" DevicePath \"\"" Mar 20 18:04:04 crc kubenswrapper[4803]: I0320 18:04:04.576713 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567164-v6nb5" event={"ID":"156d367b-5c12-4745-9cef-7605b33c7a71","Type":"ContainerDied","Data":"0b0c6870f3f35a798b3cc89020c55a229890ca6b70e9954641e315da7a3d37f2"} Mar 20 18:04:04 crc kubenswrapper[4803]: I0320 18:04:04.576755 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b0c6870f3f35a798b3cc89020c55a229890ca6b70e9954641e315da7a3d37f2" Mar 20 18:04:04 crc kubenswrapper[4803]: I0320 18:04:04.576811 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567164-v6nb5" Mar 20 18:04:05 crc kubenswrapper[4803]: I0320 18:04:05.027337 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-fwcck"] Mar 20 18:04:05 crc kubenswrapper[4803]: I0320 18:04:05.035667 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567158-fwcck"] Mar 20 18:04:06 crc kubenswrapper[4803]: I0320 18:04:06.866518 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914f86f5-acbe-44ab-91fe-64f51ecade8c" path="/var/lib/kubelet/pods/914f86f5-acbe-44ab-91fe-64f51ecade8c/volumes" Mar 20 18:04:38 crc kubenswrapper[4803]: I0320 18:04:38.246191 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:04:38 crc kubenswrapper[4803]: I0320 18:04:38.246784 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:04:40 crc kubenswrapper[4803]: I0320 18:04:40.153694 4803 scope.go:117] "RemoveContainer" containerID="7bf6147199fff9e54736cad781cda74747d329b599569d0afcf59b590e982ef1" Mar 20 18:05:08 crc kubenswrapper[4803]: I0320 18:05:08.245851 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:05:08 crc kubenswrapper[4803]: 
I0320 18:05:08.246490 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.662732 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c9f26"] Mar 20 18:05:22 crc kubenswrapper[4803]: E0320 18:05:22.663715 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156d367b-5c12-4745-9cef-7605b33c7a71" containerName="oc" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.663732 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="156d367b-5c12-4745-9cef-7605b33c7a71" containerName="oc" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.664003 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="156d367b-5c12-4745-9cef-7605b33c7a71" containerName="oc" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.665903 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.678233 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqr8\" (UniqueName: \"kubernetes.io/projected/7c0cfea4-f7be-4e49-b82e-c9745fda5299-kube-api-access-plqr8\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.678448 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-utilities\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.678705 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-catalog-content\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.685170 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9f26"] Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.779338 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-catalog-content\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.779402 4803 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-plqr8\" (UniqueName: \"kubernetes.io/projected/7c0cfea4-f7be-4e49-b82e-c9745fda5299-kube-api-access-plqr8\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.779481 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-utilities\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.780787 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-utilities\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.780982 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-catalog-content\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:22 crc kubenswrapper[4803]: I0320 18:05:22.801992 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqr8\" (UniqueName: \"kubernetes.io/projected/7c0cfea4-f7be-4e49-b82e-c9745fda5299-kube-api-access-plqr8\") pod \"redhat-operators-c9f26\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:23 crc kubenswrapper[4803]: I0320 18:05:23.051167 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:23 crc kubenswrapper[4803]: I0320 18:05:23.518467 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9f26"] Mar 20 18:05:23 crc kubenswrapper[4803]: E0320 18:05:23.888101 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0cfea4_f7be_4e49_b82e_c9745fda5299.slice/crio-conmon-1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0cfea4_f7be_4e49_b82e_c9745fda5299.slice/crio-1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6.scope\": RecentStats: unable to find data in memory cache]" Mar 20 18:05:24 crc kubenswrapper[4803]: I0320 18:05:24.368366 4803 generic.go:334] "Generic (PLEG): container finished" podID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerID="1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6" exitCode=0 Mar 20 18:05:24 crc kubenswrapper[4803]: I0320 18:05:24.368409 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9f26" event={"ID":"7c0cfea4-f7be-4e49-b82e-c9745fda5299","Type":"ContainerDied","Data":"1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6"} Mar 20 18:05:24 crc kubenswrapper[4803]: I0320 18:05:24.368609 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9f26" event={"ID":"7c0cfea4-f7be-4e49-b82e-c9745fda5299","Type":"ContainerStarted","Data":"e546d167f6eaf2535d0f449fb3a1c2676e0b302409c336a1faaf1789d47ca614"} Mar 20 18:05:26 crc kubenswrapper[4803]: I0320 18:05:26.395920 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9f26" 
event={"ID":"7c0cfea4-f7be-4e49-b82e-c9745fda5299","Type":"ContainerStarted","Data":"7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8"} Mar 20 18:05:31 crc kubenswrapper[4803]: I0320 18:05:31.475042 4803 generic.go:334] "Generic (PLEG): container finished" podID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerID="7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8" exitCode=0 Mar 20 18:05:31 crc kubenswrapper[4803]: I0320 18:05:31.475120 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9f26" event={"ID":"7c0cfea4-f7be-4e49-b82e-c9745fda5299","Type":"ContainerDied","Data":"7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8"} Mar 20 18:05:32 crc kubenswrapper[4803]: I0320 18:05:32.489856 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9f26" event={"ID":"7c0cfea4-f7be-4e49-b82e-c9745fda5299","Type":"ContainerStarted","Data":"7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e"} Mar 20 18:05:32 crc kubenswrapper[4803]: I0320 18:05:32.515721 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c9f26" podStartSLOduration=2.924100052 podStartE2EDuration="10.515694215s" podCreationTimestamp="2026-03-20 18:05:22 +0000 UTC" firstStartedPulling="2026-03-20 18:05:24.370126716 +0000 UTC m=+2934.281718786" lastFinishedPulling="2026-03-20 18:05:31.961720839 +0000 UTC m=+2941.873312949" observedRunningTime="2026-03-20 18:05:32.507130082 +0000 UTC m=+2942.418722182" watchObservedRunningTime="2026-03-20 18:05:32.515694215 +0000 UTC m=+2942.427286305" Mar 20 18:05:33 crc kubenswrapper[4803]: I0320 18:05:33.051434 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:33 crc kubenswrapper[4803]: I0320 18:05:33.051838 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:34 crc kubenswrapper[4803]: I0320 18:05:34.107674 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c9f26" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="registry-server" probeResult="failure" output=< Mar 20 18:05:34 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 18:05:34 crc kubenswrapper[4803]: > Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.246088 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.246611 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.246662 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.247461 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.247548 4803 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" gracePeriod=600 Mar 20 18:05:38 crc kubenswrapper[4803]: E0320 18:05:38.373318 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.557369 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" exitCode=0 Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.557419 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"} Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.557456 4803 scope.go:117] "RemoveContainer" containerID="b679b259902a7258f1f12079211ce05fbe7129cf0f66c72791a1fe4ab937b8bf" Mar 20 18:05:38 crc kubenswrapper[4803]: I0320 18:05:38.558234 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:05:38 crc kubenswrapper[4803]: E0320 18:05:38.558574 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:05:43 crc kubenswrapper[4803]: I0320 18:05:43.114289 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:43 crc kubenswrapper[4803]: I0320 18:05:43.174791 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:43 crc kubenswrapper[4803]: I0320 18:05:43.361992 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c9f26"] Mar 20 18:05:44 crc kubenswrapper[4803]: I0320 18:05:44.622001 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c9f26" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="registry-server" containerID="cri-o://7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e" gracePeriod=2 Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.112488 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.253409 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-utilities\") pod \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.253660 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plqr8\" (UniqueName: \"kubernetes.io/projected/7c0cfea4-f7be-4e49-b82e-c9745fda5299-kube-api-access-plqr8\") pod \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.253829 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-catalog-content\") pod \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\" (UID: \"7c0cfea4-f7be-4e49-b82e-c9745fda5299\") " Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.255772 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-utilities" (OuterVolumeSpecName: "utilities") pod "7c0cfea4-f7be-4e49-b82e-c9745fda5299" (UID: "7c0cfea4-f7be-4e49-b82e-c9745fda5299"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.261169 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0cfea4-f7be-4e49-b82e-c9745fda5299-kube-api-access-plqr8" (OuterVolumeSpecName: "kube-api-access-plqr8") pod "7c0cfea4-f7be-4e49-b82e-c9745fda5299" (UID: "7c0cfea4-f7be-4e49-b82e-c9745fda5299"). InnerVolumeSpecName "kube-api-access-plqr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.357062 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.357140 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plqr8\" (UniqueName: \"kubernetes.io/projected/7c0cfea4-f7be-4e49-b82e-c9745fda5299-kube-api-access-plqr8\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.429712 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c0cfea4-f7be-4e49-b82e-c9745fda5299" (UID: "7c0cfea4-f7be-4e49-b82e-c9745fda5299"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.459283 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0cfea4-f7be-4e49-b82e-c9745fda5299-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.634241 4803 generic.go:334] "Generic (PLEG): container finished" podID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerID="7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e" exitCode=0 Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.634287 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9f26" event={"ID":"7c0cfea4-f7be-4e49-b82e-c9745fda5299","Type":"ContainerDied","Data":"7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e"} Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.634323 4803 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-c9f26" event={"ID":"7c0cfea4-f7be-4e49-b82e-c9745fda5299","Type":"ContainerDied","Data":"e546d167f6eaf2535d0f449fb3a1c2676e0b302409c336a1faaf1789d47ca614"} Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.634339 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9f26" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.634345 4803 scope.go:117] "RemoveContainer" containerID="7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.673309 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c9f26"] Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.675453 4803 scope.go:117] "RemoveContainer" containerID="7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.682625 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c9f26"] Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.702149 4803 scope.go:117] "RemoveContainer" containerID="1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.745909 4803 scope.go:117] "RemoveContainer" containerID="7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e" Mar 20 18:05:45 crc kubenswrapper[4803]: E0320 18:05:45.746484 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e\": container with ID starting with 7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e not found: ID does not exist" containerID="7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.746573 4803 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e"} err="failed to get container status \"7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e\": rpc error: code = NotFound desc = could not find container \"7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e\": container with ID starting with 7b71d245d5f5b2ab9160987326a0e09da6b9d3c96397b04ec212312de948d00e not found: ID does not exist" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.746618 4803 scope.go:117] "RemoveContainer" containerID="7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8" Mar 20 18:05:45 crc kubenswrapper[4803]: E0320 18:05:45.747053 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8\": container with ID starting with 7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8 not found: ID does not exist" containerID="7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.747096 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8"} err="failed to get container status \"7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8\": rpc error: code = NotFound desc = could not find container \"7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8\": container with ID starting with 7dd372f67f03f671559f6ec0e740449b0ef9329f87bb4d87e55d6151e53fa3f8 not found: ID does not exist" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.747119 4803 scope.go:117] "RemoveContainer" containerID="1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6" Mar 20 18:05:45 crc kubenswrapper[4803]: E0320 
18:05:45.747450 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6\": container with ID starting with 1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6 not found: ID does not exist" containerID="1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6" Mar 20 18:05:45 crc kubenswrapper[4803]: I0320 18:05:45.747499 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6"} err="failed to get container status \"1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6\": rpc error: code = NotFound desc = could not find container \"1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6\": container with ID starting with 1ee2cfd765a8f2846e5b11f513841fdbc3ba3525f309398cb31cef0325908ce6 not found: ID does not exist" Mar 20 18:05:46 crc kubenswrapper[4803]: I0320 18:05:46.863008 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" path="/var/lib/kubelet/pods/7c0cfea4-f7be-4e49-b82e-c9745fda5299/volumes" Mar 20 18:05:50 crc kubenswrapper[4803]: I0320 18:05:50.859755 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:05:50 crc kubenswrapper[4803]: E0320 18:05:50.860976 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.147405 
4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567166-qf59q"] Mar 20 18:06:00 crc kubenswrapper[4803]: E0320 18:06:00.148459 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="extract-content" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.148474 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="extract-content" Mar 20 18:06:00 crc kubenswrapper[4803]: E0320 18:06:00.148500 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="extract-utilities" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.148509 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="extract-utilities" Mar 20 18:06:00 crc kubenswrapper[4803]: E0320 18:06:00.148555 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="registry-server" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.148563 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="registry-server" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.148762 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0cfea4-f7be-4e49-b82e-c9745fda5299" containerName="registry-server" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.149487 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-qf59q" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.152860 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.153128 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.153360 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.164225 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-qf59q"] Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.211025 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmcg\" (UniqueName: \"kubernetes.io/projected/e6481af1-e610-404e-a4ab-8ea3d0b98b5a-kube-api-access-flmcg\") pod \"auto-csr-approver-29567166-qf59q\" (UID: \"e6481af1-e610-404e-a4ab-8ea3d0b98b5a\") " pod="openshift-infra/auto-csr-approver-29567166-qf59q" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.311696 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flmcg\" (UniqueName: \"kubernetes.io/projected/e6481af1-e610-404e-a4ab-8ea3d0b98b5a-kube-api-access-flmcg\") pod \"auto-csr-approver-29567166-qf59q\" (UID: \"e6481af1-e610-404e-a4ab-8ea3d0b98b5a\") " pod="openshift-infra/auto-csr-approver-29567166-qf59q" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.334663 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmcg\" (UniqueName: \"kubernetes.io/projected/e6481af1-e610-404e-a4ab-8ea3d0b98b5a-kube-api-access-flmcg\") pod \"auto-csr-approver-29567166-qf59q\" (UID: \"e6481af1-e610-404e-a4ab-8ea3d0b98b5a\") " 
pod="openshift-infra/auto-csr-approver-29567166-qf59q" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.501968 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-qf59q" Mar 20 18:06:00 crc kubenswrapper[4803]: I0320 18:06:00.938638 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-qf59q"] Mar 20 18:06:01 crc kubenswrapper[4803]: I0320 18:06:01.808021 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-qf59q" event={"ID":"e6481af1-e610-404e-a4ab-8ea3d0b98b5a","Type":"ContainerStarted","Data":"bfb52d64936becc567ac5436e29b117a17c3857528d2f3524ba52cbddd9c9d71"} Mar 20 18:06:02 crc kubenswrapper[4803]: I0320 18:06:02.824320 4803 generic.go:334] "Generic (PLEG): container finished" podID="e6481af1-e610-404e-a4ab-8ea3d0b98b5a" containerID="034ffccd05b10cedc45cf8e32e8b973a733d945d3cd4514451adefa1d84d701c" exitCode=0 Mar 20 18:06:02 crc kubenswrapper[4803]: I0320 18:06:02.824427 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-qf59q" event={"ID":"e6481af1-e610-404e-a4ab-8ea3d0b98b5a","Type":"ContainerDied","Data":"034ffccd05b10cedc45cf8e32e8b973a733d945d3cd4514451adefa1d84d701c"} Mar 20 18:06:02 crc kubenswrapper[4803]: I0320 18:06:02.848822 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:06:02 crc kubenswrapper[4803]: E0320 18:06:02.849350 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" 
Mar 20 18:06:04 crc kubenswrapper[4803]: I0320 18:06:04.256206 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-qf59q" Mar 20 18:06:04 crc kubenswrapper[4803]: I0320 18:06:04.393565 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flmcg\" (UniqueName: \"kubernetes.io/projected/e6481af1-e610-404e-a4ab-8ea3d0b98b5a-kube-api-access-flmcg\") pod \"e6481af1-e610-404e-a4ab-8ea3d0b98b5a\" (UID: \"e6481af1-e610-404e-a4ab-8ea3d0b98b5a\") " Mar 20 18:06:04 crc kubenswrapper[4803]: I0320 18:06:04.421859 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6481af1-e610-404e-a4ab-8ea3d0b98b5a-kube-api-access-flmcg" (OuterVolumeSpecName: "kube-api-access-flmcg") pod "e6481af1-e610-404e-a4ab-8ea3d0b98b5a" (UID: "e6481af1-e610-404e-a4ab-8ea3d0b98b5a"). InnerVolumeSpecName "kube-api-access-flmcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:06:04 crc kubenswrapper[4803]: I0320 18:06:04.496139 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flmcg\" (UniqueName: \"kubernetes.io/projected/e6481af1-e610-404e-a4ab-8ea3d0b98b5a-kube-api-access-flmcg\") on node \"crc\" DevicePath \"\"" Mar 20 18:06:04 crc kubenswrapper[4803]: I0320 18:06:04.857916 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567166-qf59q" Mar 20 18:06:04 crc kubenswrapper[4803]: I0320 18:06:04.863385 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567166-qf59q" event={"ID":"e6481af1-e610-404e-a4ab-8ea3d0b98b5a","Type":"ContainerDied","Data":"bfb52d64936becc567ac5436e29b117a17c3857528d2f3524ba52cbddd9c9d71"} Mar 20 18:06:04 crc kubenswrapper[4803]: I0320 18:06:04.863455 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb52d64936becc567ac5436e29b117a17c3857528d2f3524ba52cbddd9c9d71" Mar 20 18:06:05 crc kubenswrapper[4803]: I0320 18:06:05.353652 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-99nnx"] Mar 20 18:06:05 crc kubenswrapper[4803]: I0320 18:06:05.364861 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567160-99nnx"] Mar 20 18:06:06 crc kubenswrapper[4803]: I0320 18:06:06.860791 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="621c23a1-5636-42c7-a060-bdec3307c552" path="/var/lib/kubelet/pods/621c23a1-5636-42c7-a060-bdec3307c552/volumes" Mar 20 18:06:13 crc kubenswrapper[4803]: I0320 18:06:13.848122 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:06:13 crc kubenswrapper[4803]: E0320 18:06:13.848814 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:06:28 crc kubenswrapper[4803]: I0320 18:06:28.848999 4803 scope.go:117] "RemoveContainer" 
containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:06:28 crc kubenswrapper[4803]: E0320 18:06:28.850788 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:06:40 crc kubenswrapper[4803]: I0320 18:06:40.265258 4803 scope.go:117] "RemoveContainer" containerID="3f8d588ef9b7f44fcd0eed192a1ff5614c3c25aae725a16509d9e8f8b962effd" Mar 20 18:06:42 crc kubenswrapper[4803]: I0320 18:06:42.850800 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:06:42 crc kubenswrapper[4803]: E0320 18:06:42.852220 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:06:54 crc kubenswrapper[4803]: I0320 18:06:54.848556 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:06:54 crc kubenswrapper[4803]: E0320 18:06:54.849916 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:07:08 crc kubenswrapper[4803]: I0320 18:07:08.849322 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:07:08 crc kubenswrapper[4803]: E0320 18:07:08.850325 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:07:22 crc kubenswrapper[4803]: I0320 18:07:22.854395 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:07:22 crc kubenswrapper[4803]: E0320 18:07:22.855371 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:07:34 crc kubenswrapper[4803]: I0320 18:07:34.848697 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:07:34 crc kubenswrapper[4803]: E0320 18:07:34.849471 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:07:48 crc kubenswrapper[4803]: I0320 18:07:48.848950 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:07:48 crc kubenswrapper[4803]: E0320 18:07:48.850428 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.156405 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567168-wfvfr"] Mar 20 18:08:00 crc kubenswrapper[4803]: E0320 18:08:00.157722 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6481af1-e610-404e-a4ab-8ea3d0b98b5a" containerName="oc" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.157745 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6481af1-e610-404e-a4ab-8ea3d0b98b5a" containerName="oc" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.158110 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6481af1-e610-404e-a4ab-8ea3d0b98b5a" containerName="oc" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.159115 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-wfvfr" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.162211 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.164916 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.164947 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.168087 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-wfvfr"] Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.286903 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4kd\" (UniqueName: \"kubernetes.io/projected/747b56de-18b6-4ee4-b404-b93c6061c80b-kube-api-access-xs4kd\") pod \"auto-csr-approver-29567168-wfvfr\" (UID: \"747b56de-18b6-4ee4-b404-b93c6061c80b\") " pod="openshift-infra/auto-csr-approver-29567168-wfvfr" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.389450 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs4kd\" (UniqueName: \"kubernetes.io/projected/747b56de-18b6-4ee4-b404-b93c6061c80b-kube-api-access-xs4kd\") pod \"auto-csr-approver-29567168-wfvfr\" (UID: \"747b56de-18b6-4ee4-b404-b93c6061c80b\") " pod="openshift-infra/auto-csr-approver-29567168-wfvfr" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.412055 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs4kd\" (UniqueName: \"kubernetes.io/projected/747b56de-18b6-4ee4-b404-b93c6061c80b-kube-api-access-xs4kd\") pod \"auto-csr-approver-29567168-wfvfr\" (UID: \"747b56de-18b6-4ee4-b404-b93c6061c80b\") " 
pod="openshift-infra/auto-csr-approver-29567168-wfvfr" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.482935 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-wfvfr" Mar 20 18:08:00 crc kubenswrapper[4803]: I0320 18:08:00.860304 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4" Mar 20 18:08:00 crc kubenswrapper[4803]: E0320 18:08:00.861797 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:08:01 crc kubenswrapper[4803]: I0320 18:08:01.037226 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-wfvfr"] Mar 20 18:08:01 crc kubenswrapper[4803]: I0320 18:08:01.076712 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-wfvfr" event={"ID":"747b56de-18b6-4ee4-b404-b93c6061c80b","Type":"ContainerStarted","Data":"7c34dfd410072ae71fe42061e41048983cb5a3b65ac9acafb211f248c97af567"} Mar 20 18:08:03 crc kubenswrapper[4803]: I0320 18:08:03.097983 4803 generic.go:334] "Generic (PLEG): container finished" podID="747b56de-18b6-4ee4-b404-b93c6061c80b" containerID="ab16407a8decb866d30b5bdae345dca99933946ee4c620731b2b9bf5b8d3d5b0" exitCode=0 Mar 20 18:08:03 crc kubenswrapper[4803]: I0320 18:08:03.098144 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-wfvfr" event={"ID":"747b56de-18b6-4ee4-b404-b93c6061c80b","Type":"ContainerDied","Data":"ab16407a8decb866d30b5bdae345dca99933946ee4c620731b2b9bf5b8d3d5b0"} 
Mar 20 18:08:04 crc kubenswrapper[4803]: I0320 18:08:04.481947 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-wfvfr"
Mar 20 18:08:04 crc kubenswrapper[4803]: I0320 18:08:04.577144 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs4kd\" (UniqueName: \"kubernetes.io/projected/747b56de-18b6-4ee4-b404-b93c6061c80b-kube-api-access-xs4kd\") pod \"747b56de-18b6-4ee4-b404-b93c6061c80b\" (UID: \"747b56de-18b6-4ee4-b404-b93c6061c80b\") "
Mar 20 18:08:04 crc kubenswrapper[4803]: I0320 18:08:04.584168 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/747b56de-18b6-4ee4-b404-b93c6061c80b-kube-api-access-xs4kd" (OuterVolumeSpecName: "kube-api-access-xs4kd") pod "747b56de-18b6-4ee4-b404-b93c6061c80b" (UID: "747b56de-18b6-4ee4-b404-b93c6061c80b"). InnerVolumeSpecName "kube-api-access-xs4kd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:08:04 crc kubenswrapper[4803]: I0320 18:08:04.679726 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs4kd\" (UniqueName: \"kubernetes.io/projected/747b56de-18b6-4ee4-b404-b93c6061c80b-kube-api-access-xs4kd\") on node \"crc\" DevicePath \"\""
Mar 20 18:08:05 crc kubenswrapper[4803]: I0320 18:08:05.122938 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567168-wfvfr" event={"ID":"747b56de-18b6-4ee4-b404-b93c6061c80b","Type":"ContainerDied","Data":"7c34dfd410072ae71fe42061e41048983cb5a3b65ac9acafb211f248c97af567"}
Mar 20 18:08:05 crc kubenswrapper[4803]: I0320 18:08:05.123013 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c34dfd410072ae71fe42061e41048983cb5a3b65ac9acafb211f248c97af567"
Mar 20 18:08:05 crc kubenswrapper[4803]: I0320 18:08:05.123014 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567168-wfvfr"
Mar 20 18:08:05 crc kubenswrapper[4803]: I0320 18:08:05.561118 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-nrcjv"]
Mar 20 18:08:05 crc kubenswrapper[4803]: I0320 18:08:05.571794 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567162-nrcjv"]
Mar 20 18:08:06 crc kubenswrapper[4803]: I0320 18:08:06.858256 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cd5358-529a-401b-ae71-b30f67d7e9c0" path="/var/lib/kubelet/pods/c2cd5358-529a-401b-ae71-b30f67d7e9c0/volumes"
Mar 20 18:08:11 crc kubenswrapper[4803]: I0320 18:08:11.848887 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:08:11 crc kubenswrapper[4803]: E0320 18:08:11.850229 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:08:26 crc kubenswrapper[4803]: I0320 18:08:26.847793 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:08:26 crc kubenswrapper[4803]: E0320 18:08:26.848676 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:08:37 crc kubenswrapper[4803]: I0320 18:08:37.848929 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:08:37 crc kubenswrapper[4803]: E0320 18:08:37.850045 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:08:40 crc kubenswrapper[4803]: I0320 18:08:40.371133 4803 scope.go:117] "RemoveContainer" containerID="70ebf02357aa4015af3f7526f6875647c07cacb1cd69621c18d337acfa4b8577"
Mar 20 18:08:50 crc kubenswrapper[4803]: I0320 18:08:50.859761 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:08:50 crc kubenswrapper[4803]: E0320 18:08:50.861302 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:09:02 crc kubenswrapper[4803]: I0320 18:09:02.849337 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:09:02 crc kubenswrapper[4803]: E0320 18:09:02.850319 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:09:14 crc kubenswrapper[4803]: I0320 18:09:14.848322 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:09:14 crc kubenswrapper[4803]: E0320 18:09:14.849642 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:09:27 crc kubenswrapper[4803]: I0320 18:09:27.848699 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:09:27 crc kubenswrapper[4803]: E0320 18:09:27.850421 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:09:38 crc kubenswrapper[4803]: I0320 18:09:38.849498 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:09:38 crc kubenswrapper[4803]: E0320 18:09:38.850498 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:09:53 crc kubenswrapper[4803]: I0320 18:09:53.848238 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:09:53 crc kubenswrapper[4803]: E0320 18:09:53.848992 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.210640 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567170-7dc7n"]
Mar 20 18:10:00 crc kubenswrapper[4803]: E0320 18:10:00.212250 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="747b56de-18b6-4ee4-b404-b93c6061c80b" containerName="oc"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.212285 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="747b56de-18b6-4ee4-b404-b93c6061c80b" containerName="oc"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.212852 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="747b56de-18b6-4ee4-b404-b93c6061c80b" containerName="oc"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.214250 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-7dc7n"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.216812 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.217096 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.218034 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.237648 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-7dc7n"]
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.295829 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k88w\" (UniqueName: \"kubernetes.io/projected/e595cb0d-75b2-4615-9658-eaefa75ce503-kube-api-access-8k88w\") pod \"auto-csr-approver-29567170-7dc7n\" (UID: \"e595cb0d-75b2-4615-9658-eaefa75ce503\") " pod="openshift-infra/auto-csr-approver-29567170-7dc7n"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.397878 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k88w\" (UniqueName: \"kubernetes.io/projected/e595cb0d-75b2-4615-9658-eaefa75ce503-kube-api-access-8k88w\") pod \"auto-csr-approver-29567170-7dc7n\" (UID: \"e595cb0d-75b2-4615-9658-eaefa75ce503\") " pod="openshift-infra/auto-csr-approver-29567170-7dc7n"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.430763 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k88w\" (UniqueName: \"kubernetes.io/projected/e595cb0d-75b2-4615-9658-eaefa75ce503-kube-api-access-8k88w\") pod \"auto-csr-approver-29567170-7dc7n\" (UID: \"e595cb0d-75b2-4615-9658-eaefa75ce503\") " pod="openshift-infra/auto-csr-approver-29567170-7dc7n"
Mar 20 18:10:00 crc kubenswrapper[4803]: I0320 18:10:00.558014 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-7dc7n"
Mar 20 18:10:01 crc kubenswrapper[4803]: I0320 18:10:01.071139 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-7dc7n"]
Mar 20 18:10:01 crc kubenswrapper[4803]: W0320 18:10:01.077997 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode595cb0d_75b2_4615_9658_eaefa75ce503.slice/crio-00c63dc871480b4939ee5baa906abfd8531b9e3129915b411772d80a6cc89b42 WatchSource:0}: Error finding container 00c63dc871480b4939ee5baa906abfd8531b9e3129915b411772d80a6cc89b42: Status 404 returned error can't find the container with id 00c63dc871480b4939ee5baa906abfd8531b9e3129915b411772d80a6cc89b42
Mar 20 18:10:01 crc kubenswrapper[4803]: I0320 18:10:01.082860 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 18:10:01 crc kubenswrapper[4803]: I0320 18:10:01.353155 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-7dc7n" event={"ID":"e595cb0d-75b2-4615-9658-eaefa75ce503","Type":"ContainerStarted","Data":"00c63dc871480b4939ee5baa906abfd8531b9e3129915b411772d80a6cc89b42"}
Mar 20 18:10:03 crc kubenswrapper[4803]: I0320 18:10:03.380642 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-7dc7n" event={"ID":"e595cb0d-75b2-4615-9658-eaefa75ce503","Type":"ContainerStarted","Data":"b340f385277c882a25ec48ff66272ec25d7b577e7ce8c38b5f15a671480dcc1a"}
Mar 20 18:10:03 crc kubenswrapper[4803]: I0320 18:10:03.401597 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567170-7dc7n" podStartSLOduration=1.5862921719999998 podStartE2EDuration="3.401580406s" podCreationTimestamp="2026-03-20 18:10:00 +0000 UTC" firstStartedPulling="2026-03-20 18:10:01.082570152 +0000 UTC m=+3210.994162232" lastFinishedPulling="2026-03-20 18:10:02.897858366 +0000 UTC m=+3212.809450466" observedRunningTime="2026-03-20 18:10:03.398022435 +0000 UTC m=+3213.309614505" watchObservedRunningTime="2026-03-20 18:10:03.401580406 +0000 UTC m=+3213.313172486"
Mar 20 18:10:04 crc kubenswrapper[4803]: I0320 18:10:04.395732 4803 generic.go:334] "Generic (PLEG): container finished" podID="e595cb0d-75b2-4615-9658-eaefa75ce503" containerID="b340f385277c882a25ec48ff66272ec25d7b577e7ce8c38b5f15a671480dcc1a" exitCode=0
Mar 20 18:10:04 crc kubenswrapper[4803]: I0320 18:10:04.395797 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-7dc7n" event={"ID":"e595cb0d-75b2-4615-9658-eaefa75ce503","Type":"ContainerDied","Data":"b340f385277c882a25ec48ff66272ec25d7b577e7ce8c38b5f15a671480dcc1a"}
Mar 20 18:10:05 crc kubenswrapper[4803]: I0320 18:10:05.837860 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-7dc7n"
Mar 20 18:10:05 crc kubenswrapper[4803]: I0320 18:10:05.919559 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k88w\" (UniqueName: \"kubernetes.io/projected/e595cb0d-75b2-4615-9658-eaefa75ce503-kube-api-access-8k88w\") pod \"e595cb0d-75b2-4615-9658-eaefa75ce503\" (UID: \"e595cb0d-75b2-4615-9658-eaefa75ce503\") "
Mar 20 18:10:05 crc kubenswrapper[4803]: I0320 18:10:05.934815 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e595cb0d-75b2-4615-9658-eaefa75ce503-kube-api-access-8k88w" (OuterVolumeSpecName: "kube-api-access-8k88w") pod "e595cb0d-75b2-4615-9658-eaefa75ce503" (UID: "e595cb0d-75b2-4615-9658-eaefa75ce503"). InnerVolumeSpecName "kube-api-access-8k88w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:10:06 crc kubenswrapper[4803]: I0320 18:10:06.023067 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k88w\" (UniqueName: \"kubernetes.io/projected/e595cb0d-75b2-4615-9658-eaefa75ce503-kube-api-access-8k88w\") on node \"crc\" DevicePath \"\""
Mar 20 18:10:06 crc kubenswrapper[4803]: I0320 18:10:06.412841 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567170-7dc7n" event={"ID":"e595cb0d-75b2-4615-9658-eaefa75ce503","Type":"ContainerDied","Data":"00c63dc871480b4939ee5baa906abfd8531b9e3129915b411772d80a6cc89b42"}
Mar 20 18:10:06 crc kubenswrapper[4803]: I0320 18:10:06.412891 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00c63dc871480b4939ee5baa906abfd8531b9e3129915b411772d80a6cc89b42"
Mar 20 18:10:06 crc kubenswrapper[4803]: I0320 18:10:06.412893 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567170-7dc7n"
Mar 20 18:10:06 crc kubenswrapper[4803]: I0320 18:10:06.466366 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-v6nb5"]
Mar 20 18:10:06 crc kubenswrapper[4803]: I0320 18:10:06.477770 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567164-v6nb5"]
Mar 20 18:10:06 crc kubenswrapper[4803]: I0320 18:10:06.869383 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156d367b-5c12-4745-9cef-7605b33c7a71" path="/var/lib/kubelet/pods/156d367b-5c12-4745-9cef-7605b33c7a71/volumes"
Mar 20 18:10:07 crc kubenswrapper[4803]: I0320 18:10:07.848240 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:10:07 crc kubenswrapper[4803]: E0320 18:10:07.848916 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:10:18 crc kubenswrapper[4803]: I0320 18:10:18.848852 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:10:18 crc kubenswrapper[4803]: E0320 18:10:18.850012 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:10:30 crc kubenswrapper[4803]: I0320 18:10:30.853049 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:10:30 crc kubenswrapper[4803]: E0320 18:10:30.853702 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"
Mar 20 18:10:40 crc kubenswrapper[4803]: I0320 18:10:40.493731 4803 scope.go:117] "RemoveContainer" containerID="6d5f00c2965e823b65ee19aa5013e77526b90fa5e09f43e86a1c394368dc9040"
Mar 20 18:10:44 crc kubenswrapper[4803]: I0320 18:10:44.848557 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:10:45 crc kubenswrapper[4803]: I0320 18:10:45.828684 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"2f3cc3c4717154afbdb4abc9029591c46f46936ced43a499885d3ba9d07fcb2b"}
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.472008 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-frcm9"]
Mar 20 18:10:57 crc kubenswrapper[4803]: E0320 18:10:57.473269 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e595cb0d-75b2-4615-9658-eaefa75ce503" containerName="oc"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.473288 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e595cb0d-75b2-4615-9658-eaefa75ce503" containerName="oc"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.473538 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e595cb0d-75b2-4615-9658-eaefa75ce503" containerName="oc"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.476448 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.491966 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frcm9"]
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.612506 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r98bg\" (UniqueName: \"kubernetes.io/projected/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-kube-api-access-r98bg\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.612598 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-catalog-content\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.612681 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-utilities\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.714489 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-utilities\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.714634 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r98bg\" (UniqueName: \"kubernetes.io/projected/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-kube-api-access-r98bg\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.714681 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-catalog-content\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.715049 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-utilities\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.715305 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-catalog-content\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.740895 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r98bg\" (UniqueName: \"kubernetes.io/projected/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-kube-api-access-r98bg\") pod \"certified-operators-frcm9\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") " pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:57 crc kubenswrapper[4803]: I0320 18:10:57.841071 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.074085 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrbd"]
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.076166 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.088393 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrbd"]
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.124831 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q66xw\" (UniqueName: \"kubernetes.io/projected/3bc137df-0509-4deb-8c9e-123b905d4ecf-kube-api-access-q66xw\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.125074 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-catalog-content\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.125197 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-utilities\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.226727 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q66xw\" (UniqueName: \"kubernetes.io/projected/3bc137df-0509-4deb-8c9e-123b905d4ecf-kube-api-access-q66xw\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.226812 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-catalog-content\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.226874 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-utilities\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.227493 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-utilities\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.228003 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-catalog-content\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.267532 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q66xw\" (UniqueName: \"kubernetes.io/projected/3bc137df-0509-4deb-8c9e-123b905d4ecf-kube-api-access-q66xw\") pod \"redhat-marketplace-sbrbd\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.352346 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-frcm9"]
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.400055 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.719334 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrbd"]
Mar 20 18:10:58 crc kubenswrapper[4803]: W0320 18:10:58.720730 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bc137df_0509_4deb_8c9e_123b905d4ecf.slice/crio-9f1b84a4504dab6e549e975aaf865ed6b3f3a91b338f9f04b2ed14f94baffd7e WatchSource:0}: Error finding container 9f1b84a4504dab6e549e975aaf865ed6b3f3a91b338f9f04b2ed14f94baffd7e: Status 404 returned error can't find the container with id 9f1b84a4504dab6e549e975aaf865ed6b3f3a91b338f9f04b2ed14f94baffd7e
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.955879 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerStarted","Data":"5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb"}
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.955925 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerStarted","Data":"9f1b84a4504dab6e549e975aaf865ed6b3f3a91b338f9f04b2ed14f94baffd7e"}
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.958740 4803 generic.go:334] "Generic (PLEG): container finished" podID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerID="a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3" exitCode=0
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.958768 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcm9" event={"ID":"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27","Type":"ContainerDied","Data":"a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3"}
Mar 20 18:10:58 crc kubenswrapper[4803]: I0320 18:10:58.958788 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcm9" event={"ID":"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27","Type":"ContainerStarted","Data":"2798393b692b6f35bba0072b150117ba41410bdab988bd85900f4b3fdaa0ee12"}
Mar 20 18:10:59 crc kubenswrapper[4803]: I0320 18:10:59.976018 4803 generic.go:334] "Generic (PLEG): container finished" podID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerID="5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb" exitCode=0
Mar 20 18:10:59 crc kubenswrapper[4803]: I0320 18:10:59.976432 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerDied","Data":"5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb"}
Mar 20 18:10:59 crc kubenswrapper[4803]: I0320 18:10:59.976473 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerStarted","Data":"95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2"}
Mar 20 18:10:59 crc kubenswrapper[4803]: I0320 18:10:59.979473 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcm9" event={"ID":"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27","Type":"ContainerStarted","Data":"6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60"}
Mar 20 18:11:00 crc kubenswrapper[4803]: I0320 18:11:00.993306 4803 generic.go:334] "Generic (PLEG): container finished" podID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerID="95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2" exitCode=0
Mar 20 18:11:00 crc kubenswrapper[4803]: I0320 18:11:00.994488 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerDied","Data":"95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2"}
Mar 20 18:11:00 crc kubenswrapper[4803]: I0320 18:11:00.999562 4803 generic.go:334] "Generic (PLEG): container finished" podID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerID="6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60" exitCode=0
Mar 20 18:11:00 crc kubenswrapper[4803]: I0320 18:11:00.999597 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcm9" event={"ID":"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27","Type":"ContainerDied","Data":"6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60"}
Mar 20 18:11:02 crc kubenswrapper[4803]: I0320 18:11:02.010473 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerStarted","Data":"c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4"}
Mar 20 18:11:02 crc kubenswrapper[4803]: I0320 18:11:02.014177 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcm9" event={"ID":"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27","Type":"ContainerStarted","Data":"11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129"}
Mar 20 18:11:02 crc kubenswrapper[4803]: I0320 18:11:02.040806 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbrbd" podStartSLOduration=1.5562459789999998 podStartE2EDuration="4.040785433s" podCreationTimestamp="2026-03-20 18:10:58 +0000 UTC" firstStartedPulling="2026-03-20 18:10:58.957782978 +0000 UTC m=+3268.869375058" lastFinishedPulling="2026-03-20 18:11:01.442322422 +0000 UTC m=+3271.353914512" observedRunningTime="2026-03-20 18:11:02.034332092 +0000 UTC m=+3271.945924202" watchObservedRunningTime="2026-03-20 18:11:02.040785433 +0000 UTC m=+3271.952377513"
Mar 20 18:11:02 crc kubenswrapper[4803]: I0320 18:11:02.071652 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-frcm9" podStartSLOduration=2.541062423 podStartE2EDuration="5.07162592s" podCreationTimestamp="2026-03-20 18:10:57 +0000 UTC" firstStartedPulling="2026-03-20 18:10:58.961121012 +0000 UTC m=+3268.872713092" lastFinishedPulling="2026-03-20 18:11:01.491684479 +0000 UTC m=+3271.403276589" observedRunningTime="2026-03-20 18:11:02.059852319 +0000 UTC m=+3271.971444479" watchObservedRunningTime="2026-03-20 18:11:02.07162592 +0000 UTC m=+3271.983218020"
Mar 20 18:11:07 crc kubenswrapper[4803]: I0320 18:11:07.841711 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:11:07 crc kubenswrapper[4803]: I0320 18:11:07.842268 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:11:07 crc kubenswrapper[4803]: I0320 18:11:07.897935 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:11:08 crc kubenswrapper[4803]: I0320 18:11:08.139096 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:11:08 crc kubenswrapper[4803]: I0320 18:11:08.191661 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frcm9"]
Mar 20 18:11:08 crc kubenswrapper[4803]: I0320 18:11:08.400493 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:11:08 crc kubenswrapper[4803]: I0320 18:11:08.400639 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:11:08 crc kubenswrapper[4803]: I0320 18:11:08.446667 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:11:09 crc kubenswrapper[4803]: I0320 18:11:09.149007 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbrbd"
Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.100488 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-frcm9" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="registry-server" containerID="cri-o://11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129" gracePeriod=2
Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.556566 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrbd"]
Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.642340 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frcm9"
Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.786649 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-catalog-content\") pod \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") "
Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.786898 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-utilities\") pod \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") "
Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.786937 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r98bg\" (UniqueName: \"kubernetes.io/projected/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-kube-api-access-r98bg\") pod \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\" (UID: \"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27\") "
Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.788415 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-utilities" (OuterVolumeSpecName: "utilities") pod "91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" (UID: "91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.799032 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-kube-api-access-r98bg" (OuterVolumeSpecName: "kube-api-access-r98bg") pod "91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" (UID: "91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27"). InnerVolumeSpecName "kube-api-access-r98bg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.839169 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" (UID: "91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.890554 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.890597 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r98bg\" (UniqueName: \"kubernetes.io/projected/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-kube-api-access-r98bg\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:10 crc kubenswrapper[4803]: I0320 18:11:10.890617 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.115361 4803 generic.go:334] "Generic (PLEG): container finished" podID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" 
containerID="11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129" exitCode=0 Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.115411 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-frcm9" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.115451 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcm9" event={"ID":"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27","Type":"ContainerDied","Data":"11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129"} Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.115506 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-frcm9" event={"ID":"91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27","Type":"ContainerDied","Data":"2798393b692b6f35bba0072b150117ba41410bdab988bd85900f4b3fdaa0ee12"} Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.115543 4803 scope.go:117] "RemoveContainer" containerID="11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.115996 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbrbd" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="registry-server" containerID="cri-o://c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4" gracePeriod=2 Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.142864 4803 scope.go:117] "RemoveContainer" containerID="6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.150446 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-frcm9"] Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.166825 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-frcm9"] Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.170592 4803 scope.go:117] "RemoveContainer" containerID="a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.303171 4803 scope.go:117] "RemoveContainer" containerID="11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129" Mar 20 18:11:11 crc kubenswrapper[4803]: E0320 18:11:11.303747 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129\": container with ID starting with 11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129 not found: ID does not exist" containerID="11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.303785 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129"} err="failed to get container status \"11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129\": rpc error: code = NotFound desc = could not find container \"11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129\": container with ID starting with 11ce05e33abebad129c04f0a9e616c16f1bcc8578d9f35c4a4bdaeaedd588129 not found: ID does not exist" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.303827 4803 scope.go:117] "RemoveContainer" containerID="6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60" Mar 20 18:11:11 crc kubenswrapper[4803]: E0320 18:11:11.304227 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60\": container with ID starting with 
6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60 not found: ID does not exist" containerID="6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.304268 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60"} err="failed to get container status \"6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60\": rpc error: code = NotFound desc = could not find container \"6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60\": container with ID starting with 6c8da9ade614e9929dcde39bd387bc96b043a5bccae419e749d61951ae505c60 not found: ID does not exist" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.304304 4803 scope.go:117] "RemoveContainer" containerID="a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3" Mar 20 18:11:11 crc kubenswrapper[4803]: E0320 18:11:11.304983 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3\": container with ID starting with a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3 not found: ID does not exist" containerID="a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.305011 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3"} err="failed to get container status \"a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3\": rpc error: code = NotFound desc = could not find container \"a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3\": container with ID starting with a1433d94e26e99f71849c31a62dab70c2e975826faefd4733f4dc151cbfe23d3 not found: ID does not 
exist" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.635742 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrbd" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.807438 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q66xw\" (UniqueName: \"kubernetes.io/projected/3bc137df-0509-4deb-8c9e-123b905d4ecf-kube-api-access-q66xw\") pod \"3bc137df-0509-4deb-8c9e-123b905d4ecf\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.807561 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-catalog-content\") pod \"3bc137df-0509-4deb-8c9e-123b905d4ecf\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.807726 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-utilities\") pod \"3bc137df-0509-4deb-8c9e-123b905d4ecf\" (UID: \"3bc137df-0509-4deb-8c9e-123b905d4ecf\") " Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.809348 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-utilities" (OuterVolumeSpecName: "utilities") pod "3bc137df-0509-4deb-8c9e-123b905d4ecf" (UID: "3bc137df-0509-4deb-8c9e-123b905d4ecf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.817438 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bc137df-0509-4deb-8c9e-123b905d4ecf-kube-api-access-q66xw" (OuterVolumeSpecName: "kube-api-access-q66xw") pod "3bc137df-0509-4deb-8c9e-123b905d4ecf" (UID: "3bc137df-0509-4deb-8c9e-123b905d4ecf"). InnerVolumeSpecName "kube-api-access-q66xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.859388 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bc137df-0509-4deb-8c9e-123b905d4ecf" (UID: "3bc137df-0509-4deb-8c9e-123b905d4ecf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.910644 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q66xw\" (UniqueName: \"kubernetes.io/projected/3bc137df-0509-4deb-8c9e-123b905d4ecf-kube-api-access-q66xw\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.910681 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:11 crc kubenswrapper[4803]: I0320 18:11:11.910690 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bc137df-0509-4deb-8c9e-123b905d4ecf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.131823 4803 generic.go:334] "Generic (PLEG): container finished" podID="3bc137df-0509-4deb-8c9e-123b905d4ecf" 
containerID="c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4" exitCode=0 Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.131887 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbrbd" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.131914 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerDied","Data":"c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4"} Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.132317 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbrbd" event={"ID":"3bc137df-0509-4deb-8c9e-123b905d4ecf","Type":"ContainerDied","Data":"9f1b84a4504dab6e549e975aaf865ed6b3f3a91b338f9f04b2ed14f94baffd7e"} Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.132352 4803 scope.go:117] "RemoveContainer" containerID="c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.159706 4803 scope.go:117] "RemoveContainer" containerID="95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.188915 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrbd"] Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.202007 4803 scope.go:117] "RemoveContainer" containerID="5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.207552 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbrbd"] Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.237829 4803 scope.go:117] "RemoveContainer" containerID="c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4" Mar 20 
18:11:12 crc kubenswrapper[4803]: E0320 18:11:12.238551 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4\": container with ID starting with c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4 not found: ID does not exist" containerID="c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.238611 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4"} err="failed to get container status \"c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4\": rpc error: code = NotFound desc = could not find container \"c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4\": container with ID starting with c6b3c44d8b0dbee2e4db38c46b42c5467d490931aba2fb8a94622dab91343ce4 not found: ID does not exist" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.238649 4803 scope.go:117] "RemoveContainer" containerID="95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2" Mar 20 18:11:12 crc kubenswrapper[4803]: E0320 18:11:12.239261 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2\": container with ID starting with 95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2 not found: ID does not exist" containerID="95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.239321 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2"} err="failed to get container status 
\"95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2\": rpc error: code = NotFound desc = could not find container \"95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2\": container with ID starting with 95c256a02086e0e3711bafa2f7bc68c7a0fb1d742a68619f045b39505f1a6aa2 not found: ID does not exist" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.239352 4803 scope.go:117] "RemoveContainer" containerID="5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb" Mar 20 18:11:12 crc kubenswrapper[4803]: E0320 18:11:12.239734 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb\": container with ID starting with 5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb not found: ID does not exist" containerID="5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.239762 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb"} err="failed to get container status \"5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb\": rpc error: code = NotFound desc = could not find container \"5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb\": container with ID starting with 5b06ed900d24b6702a97d44a28563308fd25b84448f3d944cd451dd08040f5fb not found: ID does not exist" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.859481 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" path="/var/lib/kubelet/pods/3bc137df-0509-4deb-8c9e-123b905d4ecf/volumes" Mar 20 18:11:12 crc kubenswrapper[4803]: I0320 18:11:12.860327 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" 
path="/var/lib/kubelet/pods/91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27/volumes" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.164230 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567172-m6pk2"] Mar 20 18:12:00 crc kubenswrapper[4803]: E0320 18:12:00.165403 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165423 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4803]: E0320 18:12:00.165452 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165468 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4803]: E0320 18:12:00.165490 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165503 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="extract-content" Mar 20 18:12:00 crc kubenswrapper[4803]: E0320 18:12:00.165564 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="extract-utilities" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165577 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="extract-utilities" Mar 20 18:12:00 crc kubenswrapper[4803]: E0320 18:12:00.165604 4803 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165616 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4803]: E0320 18:12:00.165637 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="extract-utilities" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165649 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="extract-utilities" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165949 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bc137df-0509-4deb-8c9e-123b905d4ecf" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.165976 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a643d4-e6f2-45a8-bb0b-52bb8a1f6e27" containerName="registry-server" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.167058 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-m6pk2" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.169687 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.170657 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.172844 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.179116 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-m6pk2"] Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.270675 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ppbw\" (UniqueName: \"kubernetes.io/projected/e5d59f1b-69b7-4c69-98f0-58f85f1125c0-kube-api-access-4ppbw\") pod \"auto-csr-approver-29567172-m6pk2\" (UID: \"e5d59f1b-69b7-4c69-98f0-58f85f1125c0\") " pod="openshift-infra/auto-csr-approver-29567172-m6pk2" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.373504 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ppbw\" (UniqueName: \"kubernetes.io/projected/e5d59f1b-69b7-4c69-98f0-58f85f1125c0-kube-api-access-4ppbw\") pod \"auto-csr-approver-29567172-m6pk2\" (UID: \"e5d59f1b-69b7-4c69-98f0-58f85f1125c0\") " pod="openshift-infra/auto-csr-approver-29567172-m6pk2" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.397408 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ppbw\" (UniqueName: \"kubernetes.io/projected/e5d59f1b-69b7-4c69-98f0-58f85f1125c0-kube-api-access-4ppbw\") pod \"auto-csr-approver-29567172-m6pk2\" (UID: \"e5d59f1b-69b7-4c69-98f0-58f85f1125c0\") " 
pod="openshift-infra/auto-csr-approver-29567172-m6pk2" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.493188 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-m6pk2" Mar 20 18:12:00 crc kubenswrapper[4803]: I0320 18:12:00.983111 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-m6pk2"] Mar 20 18:12:01 crc kubenswrapper[4803]: I0320 18:12:01.716106 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-m6pk2" event={"ID":"e5d59f1b-69b7-4c69-98f0-58f85f1125c0","Type":"ContainerStarted","Data":"f7562e6c0471eed32f43ae1c9c2c316a55994988877679e2c883e38e63fa4360"} Mar 20 18:12:02 crc kubenswrapper[4803]: I0320 18:12:02.729077 4803 generic.go:334] "Generic (PLEG): container finished" podID="e5d59f1b-69b7-4c69-98f0-58f85f1125c0" containerID="90af2e1ee7c5b490ff7c8bde587126b1d1bc681c20fae320b213c82791297e3e" exitCode=0 Mar 20 18:12:02 crc kubenswrapper[4803]: I0320 18:12:02.729182 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-m6pk2" event={"ID":"e5d59f1b-69b7-4c69-98f0-58f85f1125c0","Type":"ContainerDied","Data":"90af2e1ee7c5b490ff7c8bde587126b1d1bc681c20fae320b213c82791297e3e"} Mar 20 18:12:04 crc kubenswrapper[4803]: I0320 18:12:04.144674 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-m6pk2" Mar 20 18:12:04 crc kubenswrapper[4803]: I0320 18:12:04.277858 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ppbw\" (UniqueName: \"kubernetes.io/projected/e5d59f1b-69b7-4c69-98f0-58f85f1125c0-kube-api-access-4ppbw\") pod \"e5d59f1b-69b7-4c69-98f0-58f85f1125c0\" (UID: \"e5d59f1b-69b7-4c69-98f0-58f85f1125c0\") " Mar 20 18:12:04 crc kubenswrapper[4803]: I0320 18:12:04.288115 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d59f1b-69b7-4c69-98f0-58f85f1125c0-kube-api-access-4ppbw" (OuterVolumeSpecName: "kube-api-access-4ppbw") pod "e5d59f1b-69b7-4c69-98f0-58f85f1125c0" (UID: "e5d59f1b-69b7-4c69-98f0-58f85f1125c0"). InnerVolumeSpecName "kube-api-access-4ppbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:12:04 crc kubenswrapper[4803]: I0320 18:12:04.380772 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ppbw\" (UniqueName: \"kubernetes.io/projected/e5d59f1b-69b7-4c69-98f0-58f85f1125c0-kube-api-access-4ppbw\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:04 crc kubenswrapper[4803]: I0320 18:12:04.756292 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567172-m6pk2" event={"ID":"e5d59f1b-69b7-4c69-98f0-58f85f1125c0","Type":"ContainerDied","Data":"f7562e6c0471eed32f43ae1c9c2c316a55994988877679e2c883e38e63fa4360"} Mar 20 18:12:04 crc kubenswrapper[4803]: I0320 18:12:04.756778 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7562e6c0471eed32f43ae1c9c2c316a55994988877679e2c883e38e63fa4360" Mar 20 18:12:04 crc kubenswrapper[4803]: I0320 18:12:04.756375 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567172-m6pk2" Mar 20 18:12:05 crc kubenswrapper[4803]: I0320 18:12:05.236687 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-qf59q"] Mar 20 18:12:05 crc kubenswrapper[4803]: I0320 18:12:05.246261 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567166-qf59q"] Mar 20 18:12:06 crc kubenswrapper[4803]: I0320 18:12:06.860918 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6481af1-e610-404e-a4ab-8ea3d0b98b5a" path="/var/lib/kubelet/pods/e6481af1-e610-404e-a4ab-8ea3d0b98b5a/volumes" Mar 20 18:12:21 crc kubenswrapper[4803]: I0320 18:12:21.927360 4803 generic.go:334] "Generic (PLEG): container finished" podID="6ba5f719-6967-43a1-b544-c27baf20c15b" containerID="f0662d5128585c2f717e7884b477579f75ef9bd1efa8d3800000e72b095ea12e" exitCode=0 Mar 20 18:12:21 crc kubenswrapper[4803]: I0320 18:12:21.927453 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6ba5f719-6967-43a1-b544-c27baf20c15b","Type":"ContainerDied","Data":"f0662d5128585c2f717e7884b477579f75ef9bd1efa8d3800000e72b095ea12e"} Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.358279 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480283 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-workdir\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480325 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz4gs\" (UniqueName: \"kubernetes.io/projected/6ba5f719-6967-43a1-b544-c27baf20c15b-kube-api-access-fz4gs\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480384 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config-secret\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480406 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ca-certs\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480480 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480504 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-temporary\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480652 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-config-data\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480689 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ssh-key\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.480758 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config\") pod \"6ba5f719-6967-43a1-b544-c27baf20c15b\" (UID: \"6ba5f719-6967-43a1-b544-c27baf20c15b\") " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.481970 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-config-data" (OuterVolumeSpecName: "config-data") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.482509 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.486173 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.489787 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba5f719-6967-43a1-b544-c27baf20c15b-kube-api-access-fz4gs" (OuterVolumeSpecName: "kube-api-access-fz4gs") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "kube-api-access-fz4gs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.498438 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.509272 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.516810 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.533774 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.535296 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6ba5f719-6967-43a1-b544-c27baf20c15b" (UID: "6ba5f719-6967-43a1-b544-c27baf20c15b"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583143 4803 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583183 4803 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583195 4803 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583204 4803 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583214 4803 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583224 4803 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/6ba5f719-6967-43a1-b544-c27baf20c15b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583232 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz4gs\" (UniqueName: \"kubernetes.io/projected/6ba5f719-6967-43a1-b544-c27baf20c15b-kube-api-access-fz4gs\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 
crc kubenswrapper[4803]: I0320 18:12:23.583243 4803 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.583250 4803 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/6ba5f719-6967-43a1-b544-c27baf20c15b-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.610679 4803 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.684308 4803 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.949202 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"6ba5f719-6967-43a1-b544-c27baf20c15b","Type":"ContainerDied","Data":"94d746cf621473edf2bd87e8f719caf5539f1373fd13d46f10249bc600246e64"} Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.949244 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94d746cf621473edf2bd87e8f719caf5539f1373fd13d46f10249bc600246e64" Mar 20 18:12:23 crc kubenswrapper[4803]: I0320 18:12:23.949701 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.129152 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:12:34 crc kubenswrapper[4803]: E0320 18:12:34.130504 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d59f1b-69b7-4c69-98f0-58f85f1125c0" containerName="oc" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.130561 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d59f1b-69b7-4c69-98f0-58f85f1125c0" containerName="oc" Mar 20 18:12:34 crc kubenswrapper[4803]: E0320 18:12:34.130643 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba5f719-6967-43a1-b544-c27baf20c15b" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.130663 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba5f719-6967-43a1-b544-c27baf20c15b" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.131061 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d59f1b-69b7-4c69-98f0-58f85f1125c0" containerName="oc" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.131165 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba5f719-6967-43a1-b544-c27baf20c15b" containerName="tempest-tests-tempest-tests-runner" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.133038 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.136350 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m64r5" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.149646 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.292387 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.292474 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ck4\" (UniqueName: \"kubernetes.io/projected/6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6-kube-api-access-47ck4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.394897 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.395028 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ck4\" (UniqueName: 
\"kubernetes.io/projected/6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6-kube-api-access-47ck4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.395456 4803 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.436105 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ck4\" (UniqueName: \"kubernetes.io/projected/6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6-kube-api-access-47ck4\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.459765 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:34 crc kubenswrapper[4803]: I0320 18:12:34.754103 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 20 18:12:35 crc kubenswrapper[4803]: I0320 18:12:35.240499 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 20 18:12:36 crc kubenswrapper[4803]: I0320 18:12:36.069217 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6","Type":"ContainerStarted","Data":"1bec70941e5da8dcca2cb4b43f85419e58830bbd1489d86a5ce42002164e84a5"} Mar 20 18:12:37 crc kubenswrapper[4803]: I0320 18:12:37.084178 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6","Type":"ContainerStarted","Data":"40ea39dc3e5192752d13b12331fd46df745ec94ec4e46b48a471730c05a7d059"} Mar 20 18:12:37 crc kubenswrapper[4803]: I0320 18:12:37.113104 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.86155662 podStartE2EDuration="3.113070077s" podCreationTimestamp="2026-03-20 18:12:34 +0000 UTC" firstStartedPulling="2026-03-20 18:12:35.2421508 +0000 UTC m=+3365.153742910" lastFinishedPulling="2026-03-20 18:12:36.493664287 +0000 UTC m=+3366.405256367" observedRunningTime="2026-03-20 18:12:37.102950753 +0000 UTC m=+3367.014542863" watchObservedRunningTime="2026-03-20 18:12:37.113070077 +0000 UTC m=+3367.024662177" Mar 20 18:12:40 crc kubenswrapper[4803]: I0320 18:12:40.650059 4803 scope.go:117] "RemoveContainer" containerID="034ffccd05b10cedc45cf8e32e8b973a733d945d3cd4514451adefa1d84d701c" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.733737 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vdxlg/must-gather-gtjhq"] Mar 20 18:13:01 crc kubenswrapper[4803]: 
I0320 18:13:01.735767 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.743540 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vdxlg/must-gather-gtjhq"] Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.747044 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vdxlg"/"openshift-service-ca.crt" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.747279 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vdxlg"/"kube-root-ca.crt" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.862206 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2557091-e342-4378-8dd3-3355ede65628-must-gather-output\") pod \"must-gather-gtjhq\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.862368 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt24n\" (UniqueName: \"kubernetes.io/projected/d2557091-e342-4378-8dd3-3355ede65628-kube-api-access-jt24n\") pod \"must-gather-gtjhq\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.963875 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2557091-e342-4378-8dd3-3355ede65628-must-gather-output\") pod \"must-gather-gtjhq\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.964034 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt24n\" (UniqueName: \"kubernetes.io/projected/d2557091-e342-4378-8dd3-3355ede65628-kube-api-access-jt24n\") pod \"must-gather-gtjhq\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.964343 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2557091-e342-4378-8dd3-3355ede65628-must-gather-output\") pod \"must-gather-gtjhq\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:01 crc kubenswrapper[4803]: I0320 18:13:01.980511 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt24n\" (UniqueName: \"kubernetes.io/projected/d2557091-e342-4378-8dd3-3355ede65628-kube-api-access-jt24n\") pod \"must-gather-gtjhq\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:02 crc kubenswrapper[4803]: I0320 18:13:02.060411 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:13:02 crc kubenswrapper[4803]: I0320 18:13:02.534791 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vdxlg/must-gather-gtjhq"] Mar 20 18:13:03 crc kubenswrapper[4803]: I0320 18:13:03.379322 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" event={"ID":"d2557091-e342-4378-8dd3-3355ede65628","Type":"ContainerStarted","Data":"8dc1c20ca6f44a55fa4406ed702bfe1b207d53208ea0f6f6847307c3b6f60d51"} Mar 20 18:13:07 crc kubenswrapper[4803]: I0320 18:13:07.431375 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" event={"ID":"d2557091-e342-4378-8dd3-3355ede65628","Type":"ContainerStarted","Data":"e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821"} Mar 20 18:13:07 crc kubenswrapper[4803]: I0320 18:13:07.431832 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" event={"ID":"d2557091-e342-4378-8dd3-3355ede65628","Type":"ContainerStarted","Data":"52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7"} Mar 20 18:13:07 crc kubenswrapper[4803]: I0320 18:13:07.466047 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" podStartSLOduration=2.566538772 podStartE2EDuration="6.465937554s" podCreationTimestamp="2026-03-20 18:13:01 +0000 UTC" firstStartedPulling="2026-03-20 18:13:02.542839528 +0000 UTC m=+3392.454431638" lastFinishedPulling="2026-03-20 18:13:06.44223834 +0000 UTC m=+3396.353830420" observedRunningTime="2026-03-20 18:13:07.462642722 +0000 UTC m=+3397.374234872" watchObservedRunningTime="2026-03-20 18:13:07.465937554 +0000 UTC m=+3397.377529654" Mar 20 18:13:08 crc kubenswrapper[4803]: I0320 18:13:08.246781 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:13:08 crc kubenswrapper[4803]: I0320 18:13:08.247125 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.282027 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-dwchf"] Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.283454 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.285811 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vdxlg"/"default-dockercfg-h9rzl" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.445055 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lnq8\" (UniqueName: \"kubernetes.io/projected/818f8030-11d6-4061-9a83-b2a6b40a61ce-kube-api-access-2lnq8\") pod \"crc-debug-dwchf\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.445210 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/818f8030-11d6-4061-9a83-b2a6b40a61ce-host\") pod \"crc-debug-dwchf\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 
18:13:10.546914 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/818f8030-11d6-4061-9a83-b2a6b40a61ce-host\") pod \"crc-debug-dwchf\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.547055 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/818f8030-11d6-4061-9a83-b2a6b40a61ce-host\") pod \"crc-debug-dwchf\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.547306 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lnq8\" (UniqueName: \"kubernetes.io/projected/818f8030-11d6-4061-9a83-b2a6b40a61ce-kube-api-access-2lnq8\") pod \"crc-debug-dwchf\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.569806 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lnq8\" (UniqueName: \"kubernetes.io/projected/818f8030-11d6-4061-9a83-b2a6b40a61ce-kube-api-access-2lnq8\") pod \"crc-debug-dwchf\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:10 crc kubenswrapper[4803]: I0320 18:13:10.602298 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:13:11 crc kubenswrapper[4803]: I0320 18:13:11.470232 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" event={"ID":"818f8030-11d6-4061-9a83-b2a6b40a61ce","Type":"ContainerStarted","Data":"0fd201a5b5c5278af49f35d7cb20a2485dc6f5bd9a657de48a32cbd67540a6c8"} Mar 20 18:13:21 crc kubenswrapper[4803]: I0320 18:13:21.553015 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" event={"ID":"818f8030-11d6-4061-9a83-b2a6b40a61ce","Type":"ContainerStarted","Data":"b7554aa385eb6adcc35bcc83a4be674f3621f2450df4adfe9afe4d22b244c5ee"} Mar 20 18:13:21 crc kubenswrapper[4803]: I0320 18:13:21.567355 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" podStartSLOduration=1.19000499 podStartE2EDuration="11.567337442s" podCreationTimestamp="2026-03-20 18:13:10 +0000 UTC" firstStartedPulling="2026-03-20 18:13:10.649742194 +0000 UTC m=+3400.561334254" lastFinishedPulling="2026-03-20 18:13:21.027074636 +0000 UTC m=+3410.938666706" observedRunningTime="2026-03-20 18:13:21.566119048 +0000 UTC m=+3411.477711148" watchObservedRunningTime="2026-03-20 18:13:21.567337442 +0000 UTC m=+3411.478929512" Mar 20 18:13:38 crc kubenswrapper[4803]: I0320 18:13:38.246010 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:13:38 crc kubenswrapper[4803]: I0320 18:13:38.246485 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.318202 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7f2kg"] Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.321292 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.330003 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7f2kg"] Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.473634 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4mnc\" (UniqueName: \"kubernetes.io/projected/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-kube-api-access-m4mnc\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.473732 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-utilities\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.473848 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-catalog-content\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.575595 4803 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-catalog-content\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.575763 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4mnc\" (UniqueName: \"kubernetes.io/projected/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-kube-api-access-m4mnc\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.575807 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-utilities\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.576133 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-catalog-content\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.576190 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-utilities\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.603285 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-m4mnc\" (UniqueName: \"kubernetes.io/projected/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-kube-api-access-m4mnc\") pod \"community-operators-7f2kg\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:41 crc kubenswrapper[4803]: I0320 18:13:41.644071 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:42 crc kubenswrapper[4803]: I0320 18:13:42.249768 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7f2kg"] Mar 20 18:13:42 crc kubenswrapper[4803]: W0320 18:13:42.250180 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf49a5d_5aaa_480a_8ed8_bbe792791b60.slice/crio-f1e197d4a59d823b5f736f90ab2e7c9ded0631922c5a2ea474c291f582cd1534 WatchSource:0}: Error finding container f1e197d4a59d823b5f736f90ab2e7c9ded0631922c5a2ea474c291f582cd1534: Status 404 returned error can't find the container with id f1e197d4a59d823b5f736f90ab2e7c9ded0631922c5a2ea474c291f582cd1534 Mar 20 18:13:42 crc kubenswrapper[4803]: I0320 18:13:42.750420 4803 generic.go:334] "Generic (PLEG): container finished" podID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerID="f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1" exitCode=0 Mar 20 18:13:42 crc kubenswrapper[4803]: I0320 18:13:42.750509 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f2kg" event={"ID":"fcf49a5d-5aaa-480a-8ed8-bbe792791b60","Type":"ContainerDied","Data":"f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1"} Mar 20 18:13:42 crc kubenswrapper[4803]: I0320 18:13:42.750716 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f2kg" 
event={"ID":"fcf49a5d-5aaa-480a-8ed8-bbe792791b60","Type":"ContainerStarted","Data":"f1e197d4a59d823b5f736f90ab2e7c9ded0631922c5a2ea474c291f582cd1534"} Mar 20 18:13:43 crc kubenswrapper[4803]: I0320 18:13:43.762266 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f2kg" event={"ID":"fcf49a5d-5aaa-480a-8ed8-bbe792791b60","Type":"ContainerStarted","Data":"b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0"} Mar 20 18:13:44 crc kubenswrapper[4803]: I0320 18:13:44.771980 4803 generic.go:334] "Generic (PLEG): container finished" podID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerID="b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0" exitCode=0 Mar 20 18:13:44 crc kubenswrapper[4803]: I0320 18:13:44.772032 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f2kg" event={"ID":"fcf49a5d-5aaa-480a-8ed8-bbe792791b60","Type":"ContainerDied","Data":"b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0"} Mar 20 18:13:45 crc kubenswrapper[4803]: I0320 18:13:45.783500 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f2kg" event={"ID":"fcf49a5d-5aaa-480a-8ed8-bbe792791b60","Type":"ContainerStarted","Data":"e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd"} Mar 20 18:13:45 crc kubenswrapper[4803]: I0320 18:13:45.802651 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7f2kg" podStartSLOduration=2.148848457 podStartE2EDuration="4.80262675s" podCreationTimestamp="2026-03-20 18:13:41 +0000 UTC" firstStartedPulling="2026-03-20 18:13:42.752075516 +0000 UTC m=+3432.663667586" lastFinishedPulling="2026-03-20 18:13:45.405853809 +0000 UTC m=+3435.317445879" observedRunningTime="2026-03-20 18:13:45.800397638 +0000 UTC m=+3435.711989718" watchObservedRunningTime="2026-03-20 18:13:45.80262675 +0000 UTC 
m=+3435.714218820" Mar 20 18:13:51 crc kubenswrapper[4803]: I0320 18:13:51.647849 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:51 crc kubenswrapper[4803]: I0320 18:13:51.648302 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:51 crc kubenswrapper[4803]: I0320 18:13:51.697692 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:51 crc kubenswrapper[4803]: I0320 18:13:51.891806 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.006731 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7f2kg"] Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.007208 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7f2kg" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="registry-server" containerID="cri-o://e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd" gracePeriod=2 Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.463075 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.560015 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4mnc\" (UniqueName: \"kubernetes.io/projected/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-kube-api-access-m4mnc\") pod \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.560342 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-utilities\") pod \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.560413 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-catalog-content\") pod \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\" (UID: \"fcf49a5d-5aaa-480a-8ed8-bbe792791b60\") " Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.561271 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-utilities" (OuterVolumeSpecName: "utilities") pod "fcf49a5d-5aaa-480a-8ed8-bbe792791b60" (UID: "fcf49a5d-5aaa-480a-8ed8-bbe792791b60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.574912 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-kube-api-access-m4mnc" (OuterVolumeSpecName: "kube-api-access-m4mnc") pod "fcf49a5d-5aaa-480a-8ed8-bbe792791b60" (UID: "fcf49a5d-5aaa-480a-8ed8-bbe792791b60"). InnerVolumeSpecName "kube-api-access-m4mnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.622127 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcf49a5d-5aaa-480a-8ed8-bbe792791b60" (UID: "fcf49a5d-5aaa-480a-8ed8-bbe792791b60"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.664242 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4mnc\" (UniqueName: \"kubernetes.io/projected/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-kube-api-access-m4mnc\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.664275 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.664285 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf49a5d-5aaa-480a-8ed8-bbe792791b60-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.878191 4803 generic.go:334] "Generic (PLEG): container finished" podID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerID="e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd" exitCode=0 Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.878233 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7f2kg" event={"ID":"fcf49a5d-5aaa-480a-8ed8-bbe792791b60","Type":"ContainerDied","Data":"e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd"} Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.878269 4803 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-7f2kg" event={"ID":"fcf49a5d-5aaa-480a-8ed8-bbe792791b60","Type":"ContainerDied","Data":"f1e197d4a59d823b5f736f90ab2e7c9ded0631922c5a2ea474c291f582cd1534"} Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.878287 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7f2kg" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.878300 4803 scope.go:117] "RemoveContainer" containerID="e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.896606 4803 scope.go:117] "RemoveContainer" containerID="b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.951124 4803 scope.go:117] "RemoveContainer" containerID="f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.957594 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7f2kg"] Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.967730 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7f2kg"] Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 18:13:55.999198 4803 scope.go:117] "RemoveContainer" containerID="e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd" Mar 20 18:13:55 crc kubenswrapper[4803]: E0320 18:13:55.999838 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd\": container with ID starting with e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd not found: ID does not exist" containerID="e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd" Mar 20 18:13:55 crc kubenswrapper[4803]: I0320 
18:13:55.999877 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd"} err="failed to get container status \"e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd\": rpc error: code = NotFound desc = could not find container \"e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd\": container with ID starting with e0935d27bb3c1af7c2c8c10546110c6d7e4a50a2cdde02820ffa7c0fb36054dd not found: ID does not exist" Mar 20 18:13:56 crc kubenswrapper[4803]: I0320 18:13:55.999902 4803 scope.go:117] "RemoveContainer" containerID="b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0" Mar 20 18:13:56 crc kubenswrapper[4803]: E0320 18:13:56.000235 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0\": container with ID starting with b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0 not found: ID does not exist" containerID="b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0" Mar 20 18:13:56 crc kubenswrapper[4803]: I0320 18:13:56.000259 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0"} err="failed to get container status \"b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0\": rpc error: code = NotFound desc = could not find container \"b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0\": container with ID starting with b496ffd1fb7175986fac9c31ce13148f0520dbe2aac2ffd6c4ee4f39b29a4ed0 not found: ID does not exist" Mar 20 18:13:56 crc kubenswrapper[4803]: I0320 18:13:56.000300 4803 scope.go:117] "RemoveContainer" containerID="f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1" Mar 20 18:13:56 crc 
kubenswrapper[4803]: E0320 18:13:56.000740 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1\": container with ID starting with f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1 not found: ID does not exist" containerID="f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1" Mar 20 18:13:56 crc kubenswrapper[4803]: I0320 18:13:56.000774 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1"} err="failed to get container status \"f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1\": rpc error: code = NotFound desc = could not find container \"f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1\": container with ID starting with f756861ffa5d9e5b037d003435c5d4de1df02982220cfadc3fae1ee6c55e69e1 not found: ID does not exist" Mar 20 18:13:56 crc kubenswrapper[4803]: I0320 18:13:56.857614 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" path="/var/lib/kubelet/pods/fcf49a5d-5aaa-480a-8ed8-bbe792791b60/volumes" Mar 20 18:13:58 crc kubenswrapper[4803]: I0320 18:13:58.907638 4803 generic.go:334] "Generic (PLEG): container finished" podID="818f8030-11d6-4061-9a83-b2a6b40a61ce" containerID="b7554aa385eb6adcc35bcc83a4be674f3621f2450df4adfe9afe4d22b244c5ee" exitCode=0 Mar 20 18:13:58 crc kubenswrapper[4803]: I0320 18:13:58.907726 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" event={"ID":"818f8030-11d6-4061-9a83-b2a6b40a61ce","Type":"ContainerDied","Data":"b7554aa385eb6adcc35bcc83a4be674f3621f2450df4adfe9afe4d22b244c5ee"} Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.015830 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.064706 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-dwchf"] Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.076179 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-dwchf"] Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.107940 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lnq8\" (UniqueName: \"kubernetes.io/projected/818f8030-11d6-4061-9a83-b2a6b40a61ce-kube-api-access-2lnq8\") pod \"818f8030-11d6-4061-9a83-b2a6b40a61ce\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.108031 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/818f8030-11d6-4061-9a83-b2a6b40a61ce-host\") pod \"818f8030-11d6-4061-9a83-b2a6b40a61ce\" (UID: \"818f8030-11d6-4061-9a83-b2a6b40a61ce\") " Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.108442 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/818f8030-11d6-4061-9a83-b2a6b40a61ce-host" (OuterVolumeSpecName: "host") pod "818f8030-11d6-4061-9a83-b2a6b40a61ce" (UID: "818f8030-11d6-4061-9a83-b2a6b40a61ce"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.115405 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/818f8030-11d6-4061-9a83-b2a6b40a61ce-kube-api-access-2lnq8" (OuterVolumeSpecName: "kube-api-access-2lnq8") pod "818f8030-11d6-4061-9a83-b2a6b40a61ce" (UID: "818f8030-11d6-4061-9a83-b2a6b40a61ce"). InnerVolumeSpecName "kube-api-access-2lnq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.142349 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567174-2h6sj"] Mar 20 18:14:00 crc kubenswrapper[4803]: E0320 18:14:00.146112 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="extract-content" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.146145 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="extract-content" Mar 20 18:14:00 crc kubenswrapper[4803]: E0320 18:14:00.146160 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="registry-server" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.146174 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="registry-server" Mar 20 18:14:00 crc kubenswrapper[4803]: E0320 18:14:00.146185 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="818f8030-11d6-4061-9a83-b2a6b40a61ce" containerName="container-00" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.146191 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="818f8030-11d6-4061-9a83-b2a6b40a61ce" containerName="container-00" Mar 20 18:14:00 crc kubenswrapper[4803]: E0320 18:14:00.146209 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="extract-utilities" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.146215 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="extract-utilities" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.146406 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="818f8030-11d6-4061-9a83-b2a6b40a61ce" 
containerName="container-00" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.146427 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf49a5d-5aaa-480a-8ed8-bbe792791b60" containerName="registry-server" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.147468 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-2h6sj" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.151545 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.151721 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.152136 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.159706 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-2h6sj"] Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.210555 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6m5n\" (UniqueName: \"kubernetes.io/projected/5fe937aa-96cd-4b26-b569-f4962670c7de-kube-api-access-f6m5n\") pod \"auto-csr-approver-29567174-2h6sj\" (UID: \"5fe937aa-96cd-4b26-b569-f4962670c7de\") " pod="openshift-infra/auto-csr-approver-29567174-2h6sj" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.210714 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lnq8\" (UniqueName: \"kubernetes.io/projected/818f8030-11d6-4061-9a83-b2a6b40a61ce-kube-api-access-2lnq8\") on node \"crc\" DevicePath \"\"" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.210763 4803 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/818f8030-11d6-4061-9a83-b2a6b40a61ce-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.312204 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6m5n\" (UniqueName: \"kubernetes.io/projected/5fe937aa-96cd-4b26-b569-f4962670c7de-kube-api-access-f6m5n\") pod \"auto-csr-approver-29567174-2h6sj\" (UID: \"5fe937aa-96cd-4b26-b569-f4962670c7de\") " pod="openshift-infra/auto-csr-approver-29567174-2h6sj" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.334451 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6m5n\" (UniqueName: \"kubernetes.io/projected/5fe937aa-96cd-4b26-b569-f4962670c7de-kube-api-access-f6m5n\") pod \"auto-csr-approver-29567174-2h6sj\" (UID: \"5fe937aa-96cd-4b26-b569-f4962670c7de\") " pod="openshift-infra/auto-csr-approver-29567174-2h6sj" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.506329 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-2h6sj" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.860722 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="818f8030-11d6-4061-9a83-b2a6b40a61ce" path="/var/lib/kubelet/pods/818f8030-11d6-4061-9a83-b2a6b40a61ce/volumes" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.928353 4803 scope.go:117] "RemoveContainer" containerID="b7554aa385eb6adcc35bcc83a4be674f3621f2450df4adfe9afe4d22b244c5ee" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.928487 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-dwchf" Mar 20 18:14:00 crc kubenswrapper[4803]: I0320 18:14:00.960929 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-2h6sj"] Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.231532 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-9qg6c"] Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.233808 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.237198 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vdxlg"/"default-dockercfg-h9rzl" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.331297 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-host\") pod \"crc-debug-9qg6c\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.331352 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2hb\" (UniqueName: \"kubernetes.io/projected/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-kube-api-access-zm2hb\") pod \"crc-debug-9qg6c\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.433340 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-host\") pod \"crc-debug-9qg6c\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 
18:14:01.433409 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2hb\" (UniqueName: \"kubernetes.io/projected/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-kube-api-access-zm2hb\") pod \"crc-debug-9qg6c\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.433496 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-host\") pod \"crc-debug-9qg6c\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.473481 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2hb\" (UniqueName: \"kubernetes.io/projected/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-kube-api-access-zm2hb\") pod \"crc-debug-9qg6c\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.554954 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.939392 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-2h6sj" event={"ID":"5fe937aa-96cd-4b26-b569-f4962670c7de","Type":"ContainerStarted","Data":"1cb8512dc3a4f7c90650946c93f883386ba92c7122cd13f43c8ab464843ef2a4"} Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.941151 4803 generic.go:334] "Generic (PLEG): container finished" podID="3614b0b9-8c41-435f-a41f-4ebb4dacc0c4" containerID="e22a845b17b7db3e96c894d6fee547d3bb128225c56d993350a97bcc090bf868" exitCode=0 Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.941228 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" event={"ID":"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4","Type":"ContainerDied","Data":"e22a845b17b7db3e96c894d6fee547d3bb128225c56d993350a97bcc090bf868"} Mar 20 18:14:01 crc kubenswrapper[4803]: I0320 18:14:01.941257 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" event={"ID":"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4","Type":"ContainerStarted","Data":"1c6afee16695898daf990a7b0d9380b6e5e1134c3ff6f2b489b5509425fecdd3"} Mar 20 18:14:02 crc kubenswrapper[4803]: I0320 18:14:02.357590 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-9qg6c"] Mar 20 18:14:02 crc kubenswrapper[4803]: I0320 18:14:02.369094 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-9qg6c"] Mar 20 18:14:02 crc kubenswrapper[4803]: I0320 18:14:02.973209 4803 generic.go:334] "Generic (PLEG): container finished" podID="5fe937aa-96cd-4b26-b569-f4962670c7de" containerID="6a6e5bdb0913725740d8118be8e75cbe27b4a489d895a0dc72b3efbcdfecbb44" exitCode=0 Mar 20 18:14:02 crc kubenswrapper[4803]: I0320 18:14:02.973801 4803 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-infra/auto-csr-approver-29567174-2h6sj" event={"ID":"5fe937aa-96cd-4b26-b569-f4962670c7de","Type":"ContainerDied","Data":"6a6e5bdb0913725740d8118be8e75cbe27b4a489d895a0dc72b3efbcdfecbb44"} Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.082955 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-9qg6c" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.169039 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm2hb\" (UniqueName: \"kubernetes.io/projected/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-kube-api-access-zm2hb\") pod \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.169159 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-host\") pod \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\" (UID: \"3614b0b9-8c41-435f-a41f-4ebb4dacc0c4\") " Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.169665 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-host" (OuterVolumeSpecName: "host") pod "3614b0b9-8c41-435f-a41f-4ebb4dacc0c4" (UID: "3614b0b9-8c41-435f-a41f-4ebb4dacc0c4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.177685 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-kube-api-access-zm2hb" (OuterVolumeSpecName: "kube-api-access-zm2hb") pod "3614b0b9-8c41-435f-a41f-4ebb4dacc0c4" (UID: "3614b0b9-8c41-435f-a41f-4ebb4dacc0c4"). InnerVolumeSpecName "kube-api-access-zm2hb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.271210 4803 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.271250 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm2hb\" (UniqueName: \"kubernetes.io/projected/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4-kube-api-access-zm2hb\") on node \"crc\" DevicePath \"\"" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.541044 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-ldc6t"] Mar 20 18:14:03 crc kubenswrapper[4803]: E0320 18:14:03.542325 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3614b0b9-8c41-435f-a41f-4ebb4dacc0c4" containerName="container-00" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.542405 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="3614b0b9-8c41-435f-a41f-4ebb4dacc0c4" containerName="container-00" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.542702 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="3614b0b9-8c41-435f-a41f-4ebb4dacc0c4" containerName="container-00" Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.543967 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.575606 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a3e053-dea6-408f-a69b-bb1b84daadaa-host\") pod \"crc-debug-ldc6t\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") " pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.575675 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtstk\" (UniqueName: \"kubernetes.io/projected/32a3e053-dea6-408f-a69b-bb1b84daadaa-kube-api-access-wtstk\") pod \"crc-debug-ldc6t\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") " pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.677429 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a3e053-dea6-408f-a69b-bb1b84daadaa-host\") pod \"crc-debug-ldc6t\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") " pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.677781 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtstk\" (UniqueName: \"kubernetes.io/projected/32a3e053-dea6-408f-a69b-bb1b84daadaa-kube-api-access-wtstk\") pod \"crc-debug-ldc6t\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") " pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.677607 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a3e053-dea6-408f-a69b-bb1b84daadaa-host\") pod \"crc-debug-ldc6t\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") " pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.706937 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtstk\" (UniqueName: \"kubernetes.io/projected/32a3e053-dea6-408f-a69b-bb1b84daadaa-kube-api-access-wtstk\") pod \"crc-debug-ldc6t\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") " pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.859271 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:03 crc kubenswrapper[4803]: W0320 18:14:03.888140 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a3e053_dea6_408f_a69b_bb1b84daadaa.slice/crio-82cefda167ca6ae4b5df6812a02caf9ee69432640e7cd40313eedbc914267fcd WatchSource:0}: Error finding container 82cefda167ca6ae4b5df6812a02caf9ee69432640e7cd40313eedbc914267fcd: Status 404 returned error can't find the container with id 82cefda167ca6ae4b5df6812a02caf9ee69432640e7cd40313eedbc914267fcd
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.983742 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/crc-debug-ldc6t" event={"ID":"32a3e053-dea6-408f-a69b-bb1b84daadaa","Type":"ContainerStarted","Data":"82cefda167ca6ae4b5df6812a02caf9ee69432640e7cd40313eedbc914267fcd"}
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.985471 4803 scope.go:117] "RemoveContainer" containerID="e22a845b17b7db3e96c894d6fee547d3bb128225c56d993350a97bcc090bf868"
Mar 20 18:14:03 crc kubenswrapper[4803]: I0320 18:14:03.985473 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-9qg6c"
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.349105 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-2h6sj"
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.391756 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6m5n\" (UniqueName: \"kubernetes.io/projected/5fe937aa-96cd-4b26-b569-f4962670c7de-kube-api-access-f6m5n\") pod \"5fe937aa-96cd-4b26-b569-f4962670c7de\" (UID: \"5fe937aa-96cd-4b26-b569-f4962670c7de\") "
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.405765 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe937aa-96cd-4b26-b569-f4962670c7de-kube-api-access-f6m5n" (OuterVolumeSpecName: "kube-api-access-f6m5n") pod "5fe937aa-96cd-4b26-b569-f4962670c7de" (UID: "5fe937aa-96cd-4b26-b569-f4962670c7de"). InnerVolumeSpecName "kube-api-access-f6m5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.493970 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6m5n\" (UniqueName: \"kubernetes.io/projected/5fe937aa-96cd-4b26-b569-f4962670c7de-kube-api-access-f6m5n\") on node \"crc\" DevicePath \"\""
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.859257 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3614b0b9-8c41-435f-a41f-4ebb4dacc0c4" path="/var/lib/kubelet/pods/3614b0b9-8c41-435f-a41f-4ebb4dacc0c4/volumes"
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.993989 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567174-2h6sj" event={"ID":"5fe937aa-96cd-4b26-b569-f4962670c7de","Type":"ContainerDied","Data":"1cb8512dc3a4f7c90650946c93f883386ba92c7122cd13f43c8ab464843ef2a4"}
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.994031 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cb8512dc3a4f7c90650946c93f883386ba92c7122cd13f43c8ab464843ef2a4"
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.994010 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567174-2h6sj"
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.995852 4803 generic.go:334] "Generic (PLEG): container finished" podID="32a3e053-dea6-408f-a69b-bb1b84daadaa" containerID="965bea84a15ae43df694eec7dfcdbd24c6cfeac964993f2fa12c8c969afe124e" exitCode=0
Mar 20 18:14:04 crc kubenswrapper[4803]: I0320 18:14:04.995927 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/crc-debug-ldc6t" event={"ID":"32a3e053-dea6-408f-a69b-bb1b84daadaa","Type":"ContainerDied","Data":"965bea84a15ae43df694eec7dfcdbd24c6cfeac964993f2fa12c8c969afe124e"}
Mar 20 18:14:05 crc kubenswrapper[4803]: I0320 18:14:05.037409 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-ldc6t"]
Mar 20 18:14:05 crc kubenswrapper[4803]: I0320 18:14:05.048147 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vdxlg/crc-debug-ldc6t"]
Mar 20 18:14:05 crc kubenswrapper[4803]: I0320 18:14:05.436469 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-wfvfr"]
Mar 20 18:14:05 crc kubenswrapper[4803]: I0320 18:14:05.452973 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567168-wfvfr"]
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.124435 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.324684 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a3e053-dea6-408f-a69b-bb1b84daadaa-host\") pod \"32a3e053-dea6-408f-a69b-bb1b84daadaa\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") "
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.324845 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32a3e053-dea6-408f-a69b-bb1b84daadaa-host" (OuterVolumeSpecName: "host") pod "32a3e053-dea6-408f-a69b-bb1b84daadaa" (UID: "32a3e053-dea6-408f-a69b-bb1b84daadaa"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.324984 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtstk\" (UniqueName: \"kubernetes.io/projected/32a3e053-dea6-408f-a69b-bb1b84daadaa-kube-api-access-wtstk\") pod \"32a3e053-dea6-408f-a69b-bb1b84daadaa\" (UID: \"32a3e053-dea6-408f-a69b-bb1b84daadaa\") "
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.325681 4803 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/32a3e053-dea6-408f-a69b-bb1b84daadaa-host\") on node \"crc\" DevicePath \"\""
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.330818 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a3e053-dea6-408f-a69b-bb1b84daadaa-kube-api-access-wtstk" (OuterVolumeSpecName: "kube-api-access-wtstk") pod "32a3e053-dea6-408f-a69b-bb1b84daadaa" (UID: "32a3e053-dea6-408f-a69b-bb1b84daadaa"). InnerVolumeSpecName "kube-api-access-wtstk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.428169 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtstk\" (UniqueName: \"kubernetes.io/projected/32a3e053-dea6-408f-a69b-bb1b84daadaa-kube-api-access-wtstk\") on node \"crc\" DevicePath \"\""
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.883338 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a3e053-dea6-408f-a69b-bb1b84daadaa" path="/var/lib/kubelet/pods/32a3e053-dea6-408f-a69b-bb1b84daadaa/volumes"
Mar 20 18:14:06 crc kubenswrapper[4803]: I0320 18:14:06.884320 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="747b56de-18b6-4ee4-b404-b93c6061c80b" path="/var/lib/kubelet/pods/747b56de-18b6-4ee4-b404-b93c6061c80b/volumes"
Mar 20 18:14:06 crc kubenswrapper[4803]: E0320 18:14:06.988184 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32a3e053_dea6_408f_a69b_bb1b84daadaa.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 18:14:07 crc kubenswrapper[4803]: I0320 18:14:07.015692 4803 scope.go:117] "RemoveContainer" containerID="965bea84a15ae43df694eec7dfcdbd24c6cfeac964993f2fa12c8c969afe124e"
Mar 20 18:14:07 crc kubenswrapper[4803]: I0320 18:14:07.015739 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vdxlg/crc-debug-ldc6t"
Mar 20 18:14:08 crc kubenswrapper[4803]: I0320 18:14:08.245690 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 18:14:08 crc kubenswrapper[4803]: I0320 18:14:08.245967 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 18:14:08 crc kubenswrapper[4803]: I0320 18:14:08.246008 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll"
Mar 20 18:14:08 crc kubenswrapper[4803]: I0320 18:14:08.246608 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2f3cc3c4717154afbdb4abc9029591c46f46936ced43a499885d3ba9d07fcb2b"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 18:14:08 crc kubenswrapper[4803]: I0320 18:14:08.246655 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://2f3cc3c4717154afbdb4abc9029591c46f46936ced43a499885d3ba9d07fcb2b" gracePeriod=600
Mar 20 18:14:09 crc kubenswrapper[4803]: I0320 18:14:09.038208 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="2f3cc3c4717154afbdb4abc9029591c46f46936ced43a499885d3ba9d07fcb2b" exitCode=0
Mar 20 18:14:09 crc kubenswrapper[4803]: I0320 18:14:09.038301 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"2f3cc3c4717154afbdb4abc9029591c46f46936ced43a499885d3ba9d07fcb2b"}
Mar 20 18:14:09 crc kubenswrapper[4803]: I0320 18:14:09.038916 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f"}
Mar 20 18:14:09 crc kubenswrapper[4803]: I0320 18:14:09.038947 4803 scope.go:117] "RemoveContainer" containerID="aa6c5a9abd08b0d30446fbb7f767a11340b5067a34c0a4163dcef9ca6d33afa4"
Mar 20 18:14:21 crc kubenswrapper[4803]: I0320 18:14:21.539613 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648ff8c756-h8wjt_deeb9bec-8bcb-48ef-ba67-b5772825f753/barbican-api/0.log"
Mar 20 18:14:21 crc kubenswrapper[4803]: I0320 18:14:21.634465 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648ff8c756-h8wjt_deeb9bec-8bcb-48ef-ba67-b5772825f753/barbican-api-log/0.log"
Mar 20 18:14:21 crc kubenswrapper[4803]: I0320 18:14:21.719887 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b99997f8b-mzkpj_726567e8-cdfd-4fe4-985f-1cf10a787994/barbican-keystone-listener/0.log"
Mar 20 18:14:21 crc kubenswrapper[4803]: I0320 18:14:21.792406 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b99997f8b-mzkpj_726567e8-cdfd-4fe4-985f-1cf10a787994/barbican-keystone-listener-log/0.log"
Mar 20 18:14:21 crc kubenswrapper[4803]: I0320 18:14:21.885349 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bfd9f5f6f-xmdsr_43090279-19af-4393-ab9a-1092aae61875/barbican-worker/0.log"
Mar 20 18:14:21 crc kubenswrapper[4803]: I0320 18:14:21.964135 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bfd9f5f6f-xmdsr_43090279-19af-4393-ab9a-1092aae61875/barbican-worker-log/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.147942 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn_0c2a599d-17eb-4116-8b3e-a9adc8a7568b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.186404 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/ceilometer-central-agent/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.232940 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/ceilometer-notification-agent/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.292599 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/proxy-httpd/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.343474 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/sg-core/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.477479 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_035c521a-8bdf-4489-a429-3629df54ca84/cinder-api-log/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.488371 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_035c521a-8bdf-4489-a429-3629df54ca84/cinder-api/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.633070 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_78208796-d6b5-472e-9f4b-0f582d5bcfc9/cinder-scheduler/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.671397 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_78208796-d6b5-472e-9f4b-0f582d5bcfc9/probe/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.825052 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5_04e7436c-88e7-4d1b-b078-cbee6adf422d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:22 crc kubenswrapper[4803]: I0320 18:14:22.982900 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh_7bb67428-717c-47fc-9ab1-b94a5c502298/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.042792 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-hh5vh_6516522c-430f-476f-8471-d5b39263571f/init/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.175937 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-hh5vh_6516522c-430f-476f-8471-d5b39263571f/init/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.234501 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-hh5vh_6516522c-430f-476f-8471-d5b39263571f/dnsmasq-dns/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.335895 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn_c2c45e4b-27e5-4614-875f-838444583617/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.452848 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_de8af7e5-2a44-4caa-883c-7ef2027e69c4/glance-httpd/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.460711 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_de8af7e5-2a44-4caa-883c-7ef2027e69c4/glance-log/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.627298 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_27ce9fd0-5867-4d7f-aec4-70784b898289/glance-log/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.660763 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_27ce9fd0-5867-4d7f-aec4-70784b898289/glance-httpd/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.873245 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-596cfc5b56-w5pbk_76a924c8-a380-4ee2-a6ce-ac77f0979f24/horizon/0.log"
Mar 20 18:14:23 crc kubenswrapper[4803]: I0320 18:14:23.946193 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf_11eacc86-5400-4614-bfa1-353a4c9a4ef8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:24 crc kubenswrapper[4803]: I0320 18:14:24.179157 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-596cfc5b56-w5pbk_76a924c8-a380-4ee2-a6ce-ac77f0979f24/horizon-log/0.log"
Mar 20 18:14:24 crc kubenswrapper[4803]: I0320 18:14:24.321871 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-26xq7_de5b2da1-3a72-4b62-98a5-71352eb71c90/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:24 crc kubenswrapper[4803]: I0320 18:14:24.433416 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-685pt_5b706f4a-4386-4911-a201-5cbf5e7bd916/keystone-cron/0.log"
Mar 20 18:14:24 crc kubenswrapper[4803]: I0320 18:14:24.515534 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85847589bb-9pbbf_42bf225e-cdca-4bc8-922d-8ff2bcb6ff17/keystone-api/0.log"
Mar 20 18:14:24 crc kubenswrapper[4803]: I0320 18:14:24.624830 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e13d6de9-6ef6-4194-98da-c8fee814f4d1/kube-state-metrics/0.log"
Mar 20 18:14:25 crc kubenswrapper[4803]: I0320 18:14:25.244746 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz_f644be5e-0ec5-499a-a42d-b4381159e310/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:25 crc kubenswrapper[4803]: I0320 18:14:25.254360 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b69588b57-phch4_9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6/neutron-httpd/0.log"
Mar 20 18:14:25 crc kubenswrapper[4803]: I0320 18:14:25.271638 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b69588b57-phch4_9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6/neutron-api/0.log"
Mar 20 18:14:25 crc kubenswrapper[4803]: I0320 18:14:25.573576 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc_6aabf807-6d9b-4e6a-99db-ddfda9979b25/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:25 crc kubenswrapper[4803]: I0320 18:14:25.945683 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b56a359b-06fc-40e2-b128-c2427461160a/nova-api-log/0.log"
Mar 20 18:14:25 crc kubenswrapper[4803]: I0320 18:14:25.980251 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ad2e7092-e2b5-4f0f-bde6-892cb3660837/nova-cell0-conductor-conductor/0.log"
Mar 20 18:14:26 crc kubenswrapper[4803]: I0320 18:14:26.209206 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b56a359b-06fc-40e2-b128-c2427461160a/nova-api-api/0.log"
Mar 20 18:14:26 crc kubenswrapper[4803]: I0320 18:14:26.276576 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_62cc6e29-7f2d-4c3d-b32c-4b906520eded/nova-cell1-novncproxy-novncproxy/0.log"
Mar 20 18:14:26 crc kubenswrapper[4803]: I0320 18:14:26.298463 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_76a50897-11fb-467d-8b45-dde2ad95b1fd/nova-cell1-conductor-conductor/0.log"
Mar 20 18:14:26 crc kubenswrapper[4803]: I0320 18:14:26.629979 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9461d82d-3e47-4dea-889c-a82c7f4a97b4/nova-metadata-log/0.log"
Mar 20 18:14:26 crc kubenswrapper[4803]: I0320 18:14:26.992168 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9nrx2_0b06d67e-e61d-499b-bce7-41b3f0b3b509/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.000967 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_51598f47-53eb-4d5d-917d-3655d7e200e8/nova-scheduler-scheduler/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.023659 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9461d82d-3e47-4dea-889c-a82c7f4a97b4/nova-metadata-metadata/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.133169 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f1b71013-6f7a-4559-a3bc-af90c284cada/mysql-bootstrap/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.308769 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f1b71013-6f7a-4559-a3bc-af90c284cada/mysql-bootstrap/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.364777 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_be331baf-1bef-41ab-ac10-b8686ecb5a30/mysql-bootstrap/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.390872 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f1b71013-6f7a-4559-a3bc-af90c284cada/galera/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.599417 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_be331baf-1bef-41ab-ac10-b8686ecb5a30/mysql-bootstrap/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.603578 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_be331baf-1bef-41ab-ac10-b8686ecb5a30/galera/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.626415 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d15f9821-6db3-4c08-b731-d0c7349b4076/openstackclient/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.832549 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-56z85_1d21c995-a420-4ac7-8cd9-c186be9e4ba0/ovn-controller/0.log"
Mar 20 18:14:27 crc kubenswrapper[4803]: I0320 18:14:27.854293 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kgbvn_7cc9f89e-5d0f-4f92-93fa-c7a0133baf05/openstack-network-exporter/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.047339 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovsdb-server-init/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.227057 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovs-vswitchd/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.264271 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovsdb-server/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.275439 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovsdb-server-init/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.512145 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-knxqc_6bd23d21-aeac-4394-b83d-befbd825d5ce/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.523398 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_02135c00-5c50-45c9-a206-85f9e60d9c6e/openstack-network-exporter/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.542676 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_02135c00-5c50-45c9-a206-85f9e60d9c6e/ovn-northd/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.692434 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ab4e6fd1-195c-4ff0-8288-14454e4ea4f1/openstack-network-exporter/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.709614 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ab4e6fd1-195c-4ff0-8288-14454e4ea4f1/ovsdbserver-nb/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.908091 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c9af9eb-0645-4fe0-a558-6ae86595685e/ovsdbserver-sb/0.log"
Mar 20 18:14:28 crc kubenswrapper[4803]: I0320 18:14:28.921797 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c9af9eb-0645-4fe0-a558-6ae86595685e/openstack-network-exporter/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.100481 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5984b77b84-4dhqv_d3ebc450-0e00-492b-a5ab-f02c63aa4071/placement-api/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.130872 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6450d307-a4cf-4d3c-acdd-31a50aec6109/setup-container/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.159438 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5984b77b84-4dhqv_d3ebc450-0e00-492b-a5ab-f02c63aa4071/placement-log/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.329551 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6450d307-a4cf-4d3c-acdd-31a50aec6109/setup-container/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.393847 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6450d307-a4cf-4d3c-acdd-31a50aec6109/rabbitmq/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.443159 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72227685-667c-47b8-aedb-0329dd683bc0/setup-container/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.669620 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72227685-667c-47b8-aedb-0329dd683bc0/setup-container/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.687272 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72227685-667c-47b8-aedb-0329dd683bc0/rabbitmq/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.696085 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp_6091d3e2-8164-4c5c-b5b9-9299dbe203d5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.936257 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5_be466e4c-5034-4784-9c39-a390b28adb2e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:29 crc kubenswrapper[4803]: I0320 18:14:29.946876 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qtns4_5daa4111-ac59-4390-b425-d56ac571c768/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.162189 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gfh5h_2aed2dd6-8792-412c-972f-6e7e4e4bae0e/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.190806 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-54x78_8a143d0c-3c8c-4426-8b98-1309a281aaf8/ssh-known-hosts-edpm-deployment/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.379503 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5598996667-w8j76_8b611951-3d24-49d2-a5bc-18d41e478610/proxy-server/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.431723 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5598996667-w8j76_8b611951-3d24-49d2-a5bc-18d41e478610/proxy-httpd/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.557981 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2d9wv_e846bdbc-0d6d-4dc2-9a0b-d188913b5eda/swift-ring-rebalance/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.615698 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-auditor/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.659969 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-reaper/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.823056 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-server/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.833677 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-replicator/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.858619 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-auditor/0.log"
Mar 20 18:14:30 crc kubenswrapper[4803]: I0320 18:14:30.908533 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-replicator/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.034595 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-server/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.035844 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-updater/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.122234 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-expirer/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.126753 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-auditor/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.243291 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-replicator/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.258170 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-server/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.345914 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-updater/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.349725 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/rsync/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.497104 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/swift-recon-cron/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.800355 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6ba5f719-6967-43a1-b544-c27baf20c15b/tempest-tests-tempest-tests-runner/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.820868 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6/test-operator-logs-container/0.log"
Mar 20 18:14:31 crc kubenswrapper[4803]: I0320 18:14:31.987121 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w_149f9011-aff3-4d4a-ac40-d1325e1bdad0/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:32 crc kubenswrapper[4803]: I0320 18:14:32.025791 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr_64a2f129-704a-4165-88ee-2cab50cc59d9/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Mar 20 18:14:40 crc kubenswrapper[4803]: I0320 18:14:40.776314 4803 scope.go:117] "RemoveContainer" containerID="ab16407a8decb866d30b5bdae345dca99933946ee4c620731b2b9bf5b8d3d5b0"
Mar 20 18:14:41 crc kubenswrapper[4803]: I0320 18:14:41.447401 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cd627740-5358-468a-bb90-21d52992a407/memcached/0.log"
Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.222605 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/util/0.log"
Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.378830 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/util/0.log"
Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.414052 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/pull/0.log"
Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.418336 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/pull/0.log"
Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.600072 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/extract/0.log"
Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.605431 4803 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/pull/0.log" Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.650204 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/util/0.log" Mar 20 18:14:58 crc kubenswrapper[4803]: I0320 18:14:58.845247 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-8gf25_fefd7fa6-fd31-4b28-a7e9-1a4e630070fe/manager/0.log" Mar 20 18:14:59 crc kubenswrapper[4803]: I0320 18:14:59.107328 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-b7sz6_cfe324bc-b8c8-4971-b1d5-ed9df499771f/manager/0.log" Mar 20 18:14:59 crc kubenswrapper[4803]: I0320 18:14:59.159479 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-lxz9k_b3eec30a-a8b9-40b8-a786-16a339efe990/manager/0.log" Mar 20 18:14:59 crc kubenswrapper[4803]: I0320 18:14:59.370003 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-b5q6h_10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5/manager/0.log" Mar 20 18:14:59 crc kubenswrapper[4803]: I0320 18:14:59.583161 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-smx6n_25fa5c58-b9f7-4cd8-b1c4-41c190df40f1/manager/0.log" Mar 20 18:14:59 crc kubenswrapper[4803]: I0320 18:14:59.738083 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-7w5w2_524f49fa-3d73-4089-aa72-cbcfdfbed979/manager/0.log" Mar 20 18:14:59 crc 
kubenswrapper[4803]: I0320 18:14:59.796224 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-cqwk9_7a5c980d-5ccf-4e9d-9687-3119240ecc15/manager/0.log" Mar 20 18:14:59 crc kubenswrapper[4803]: I0320 18:14:59.860703 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6f8b7f6fdf-qxc8h_0dbcced3-01dd-45bb-8f6f-1733abb4f7db/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.027248 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-k8594_5ea17b5b-3363-4c08-a7e3-52cbb4cb5616/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.039516 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-wgxn7_23dabc24-9851-4232-8f79-56c8615246c7/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.146134 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62"] Mar 20 18:15:00 crc kubenswrapper[4803]: E0320 18:15:00.146510 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fe937aa-96cd-4b26-b569-f4962670c7de" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.146542 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe937aa-96cd-4b26-b569-f4962670c7de" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4803]: E0320 18:15:00.146561 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a3e053-dea6-408f-a69b-bb1b84daadaa" containerName="container-00" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.146568 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a3e053-dea6-408f-a69b-bb1b84daadaa" containerName="container-00" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 
18:15:00.146765 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a3e053-dea6-408f-a69b-bb1b84daadaa" containerName="container-00" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.146791 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe937aa-96cd-4b26-b569-f4962670c7de" containerName="oc" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.147371 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.150854 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.150871 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.155023 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62"] Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.263091 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-6pfw8_b01812c6-e35c-4699-8b7b-192a425bf0ce/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.311350 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-xwbd2_75cb9425-bd1a-4311-a85f-76eee943e0e9/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.328656 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c3e9b0e-9caa-478c-9dd5-59d00106784d-secret-volume\") pod \"collect-profiles-29567175-qbj62\" (UID: 
\"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.329305 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c3e9b0e-9caa-478c-9dd5-59d00106784d-config-volume\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.329377 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lttm\" (UniqueName: \"kubernetes.io/projected/6c3e9b0e-9caa-478c-9dd5-59d00106784d-kube-api-access-6lttm\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.430998 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c3e9b0e-9caa-478c-9dd5-59d00106784d-secret-volume\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.431436 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c3e9b0e-9caa-478c-9dd5-59d00106784d-config-volume\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.431490 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6lttm\" (UniqueName: \"kubernetes.io/projected/6c3e9b0e-9caa-478c-9dd5-59d00106784d-kube-api-access-6lttm\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.432439 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c3e9b0e-9caa-478c-9dd5-59d00106784d-config-volume\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.441127 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c3e9b0e-9caa-478c-9dd5-59d00106784d-secret-volume\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.447004 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lttm\" (UniqueName: \"kubernetes.io/projected/6c3e9b0e-9caa-478c-9dd5-59d00106784d-kube-api-access-6lttm\") pod \"collect-profiles-29567175-qbj62\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.452176 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-9dwg8_ff62aaff-bfdc-400e-b6ee-217356ba1a23/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.485916 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.509644 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-vmmmv_e5270e18-ac4b-4f4d-8a6d-085699034cfe/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.728394 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5vccbg_002615ba-1b17-467b-a536-7a631c5b434e/manager/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.858193 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b67cc5c9-9mrzt_c7141176-2b0e-4fb1-8c92-a424c769e059/operator/0.log" Mar 20 18:15:00 crc kubenswrapper[4803]: I0320 18:15:00.988220 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62"] Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.163125 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sbnxs_e3b2e374-c317-4d96-80e5-f3e274b31ea8/registry-server/0.log" Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.466326 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-lktjh_b6b15ce6-e69e-42ad-a356-9802f8750db4/manager/0.log" Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.533934 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-22skw_f7f518fa-6e0e-431d-88f4-f835400eec2a/manager/0.log" Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.660585 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" 
event={"ID":"6c3e9b0e-9caa-478c-9dd5-59d00106784d","Type":"ContainerStarted","Data":"0fa85488e877385bef94f12146cbb5686728765bb73ce8ef30d4afa434888128"} Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.660639 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" event={"ID":"6c3e9b0e-9caa-478c-9dd5-59d00106784d","Type":"ContainerStarted","Data":"28a25d932d051e29aeb3e4a82349b5bce13580db37e46cdfd55d8dac0c42c0f9"} Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.676431 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" podStartSLOduration=1.676417379 podStartE2EDuration="1.676417379s" podCreationTimestamp="2026-03-20 18:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:15:01.675232805 +0000 UTC m=+3511.586824875" watchObservedRunningTime="2026-03-20 18:15:01.676417379 +0000 UTC m=+3511.588009449" Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.792521 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-sh6jl_6616ca24-82bf-405a-b77b-8617c65ec76b/manager/0.log" Mar 20 18:15:01 crc kubenswrapper[4803]: I0320 18:15:01.972263 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-fgckh_76713a0c-ed94-4e45-a947-e05e3ef0d3d6/manager/0.log" Mar 20 18:15:02 crc kubenswrapper[4803]: I0320 18:15:02.002070 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56f44579c8-sht64_36ca4d28-1feb-4c48-bba8-d078f85fc37f/manager/0.log" Mar 20 18:15:02 crc kubenswrapper[4803]: I0320 18:15:02.051773 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-b7znw_6409d065-33f9-4e12-806a-b805e4b6e0ea/manager/0.log" Mar 20 18:15:02 crc kubenswrapper[4803]: I0320 18:15:02.198137 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-dxp5d_a8a0f2f5-7910-44c7-969d-204a7d1327d9/manager/0.log" Mar 20 18:15:02 crc kubenswrapper[4803]: I0320 18:15:02.670634 4803 generic.go:334] "Generic (PLEG): container finished" podID="6c3e9b0e-9caa-478c-9dd5-59d00106784d" containerID="0fa85488e877385bef94f12146cbb5686728765bb73ce8ef30d4afa434888128" exitCode=0 Mar 20 18:15:02 crc kubenswrapper[4803]: I0320 18:15:02.670744 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" event={"ID":"6c3e9b0e-9caa-478c-9dd5-59d00106784d","Type":"ContainerDied","Data":"0fa85488e877385bef94f12146cbb5686728765bb73ce8ef30d4afa434888128"} Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.086671 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.107788 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c3e9b0e-9caa-478c-9dd5-59d00106784d-config-volume\") pod \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.107956 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lttm\" (UniqueName: \"kubernetes.io/projected/6c3e9b0e-9caa-478c-9dd5-59d00106784d-kube-api-access-6lttm\") pod \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.108039 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c3e9b0e-9caa-478c-9dd5-59d00106784d-secret-volume\") pod \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\" (UID: \"6c3e9b0e-9caa-478c-9dd5-59d00106784d\") " Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.108512 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c3e9b0e-9caa-478c-9dd5-59d00106784d-config-volume" (OuterVolumeSpecName: "config-volume") pod "6c3e9b0e-9caa-478c-9dd5-59d00106784d" (UID: "6c3e9b0e-9caa-478c-9dd5-59d00106784d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.108881 4803 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6c3e9b0e-9caa-478c-9dd5-59d00106784d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.113508 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c3e9b0e-9caa-478c-9dd5-59d00106784d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6c3e9b0e-9caa-478c-9dd5-59d00106784d" (UID: "6c3e9b0e-9caa-478c-9dd5-59d00106784d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.114285 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c3e9b0e-9caa-478c-9dd5-59d00106784d-kube-api-access-6lttm" (OuterVolumeSpecName: "kube-api-access-6lttm") pod "6c3e9b0e-9caa-478c-9dd5-59d00106784d" (UID: "6c3e9b0e-9caa-478c-9dd5-59d00106784d"). InnerVolumeSpecName "kube-api-access-6lttm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.210300 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lttm\" (UniqueName: \"kubernetes.io/projected/6c3e9b0e-9caa-478c-9dd5-59d00106784d-kube-api-access-6lttm\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.210342 4803 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6c3e9b0e-9caa-478c-9dd5-59d00106784d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.718436 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" event={"ID":"6c3e9b0e-9caa-478c-9dd5-59d00106784d","Type":"ContainerDied","Data":"28a25d932d051e29aeb3e4a82349b5bce13580db37e46cdfd55d8dac0c42c0f9"} Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.718473 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28a25d932d051e29aeb3e4a82349b5bce13580db37e46cdfd55d8dac0c42c0f9" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.718545 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567175-qbj62" Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.762088 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2"] Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.773961 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567130-vlcc2"] Mar 20 18:15:04 crc kubenswrapper[4803]: I0320 18:15:04.859316 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2e7069-f589-4f29-b6aa-c4622603e334" path="/var/lib/kubelet/pods/df2e7069-f589-4f29-b6aa-c4622603e334/volumes" Mar 20 18:15:20 crc kubenswrapper[4803]: I0320 18:15:20.777538 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-k5nvk_f13296fc-7b19-43e5-9f80-08502dee6f1b/control-plane-machine-set-operator/0.log" Mar 20 18:15:20 crc kubenswrapper[4803]: I0320 18:15:20.913059 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ph6zk_f3f47d35-b096-47cb-879d-05004b9cbcf4/kube-rbac-proxy/0.log" Mar 20 18:15:20 crc kubenswrapper[4803]: I0320 18:15:20.939882 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ph6zk_f3f47d35-b096-47cb-879d-05004b9cbcf4/machine-api-operator/0.log" Mar 20 18:15:33 crc kubenswrapper[4803]: I0320 18:15:33.489154 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-dt62l_06da84ce-5bd4-4e75-ac2d-eda831724e58/cert-manager-controller/0.log" Mar 20 18:15:33 crc kubenswrapper[4803]: I0320 18:15:33.647248 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-wfvp8_4d10f699-d096-4cc0-bd7d-1afc806ede10/cert-manager-cainjector/0.log" Mar 20 18:15:33 crc kubenswrapper[4803]: I0320 18:15:33.682571 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-gq2dl_ea5d9398-4412-4a8c-a015-42ec0733de0c/cert-manager-webhook/0.log" Mar 20 18:15:40 crc kubenswrapper[4803]: I0320 18:15:40.915768 4803 scope.go:117] "RemoveContainer" containerID="83900e21ba5143ee4fa6705d4b516b5e041200bd9912731f132257c91f5830c3" Mar 20 18:15:46 crc kubenswrapper[4803]: I0320 18:15:46.302093 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-2nnmb_38d598c3-b9e5-4404-abb4-da1e9354e157/nmstate-console-plugin/0.log" Mar 20 18:15:46 crc kubenswrapper[4803]: I0320 18:15:46.421380 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-stkpl_7130990e-c3d3-48fc-99a3-31f225ec19ee/nmstate-handler/0.log" Mar 20 18:15:46 crc kubenswrapper[4803]: I0320 18:15:46.479195 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-fjk5k_8b0bc609-411b-43cc-b7cf-a88f669b2d44/kube-rbac-proxy/0.log" Mar 20 18:15:46 crc kubenswrapper[4803]: I0320 18:15:46.507759 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-fjk5k_8b0bc609-411b-43cc-b7cf-a88f669b2d44/nmstate-metrics/0.log" Mar 20 18:15:46 crc kubenswrapper[4803]: I0320 18:15:46.640836 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-xgfwl_f2fb97b5-40f7-443d-8680-95e112804031/nmstate-operator/0.log" Mar 20 18:15:46 crc kubenswrapper[4803]: I0320 18:15:46.700157 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-krj4f_0bf484cc-0fe2-4cb0-99c2-0714910012ca/nmstate-webhook/0.log" Mar 20 
18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.141264 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567176-8vdc6"] Mar 20 18:16:00 crc kubenswrapper[4803]: E0320 18:16:00.142224 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c3e9b0e-9caa-478c-9dd5-59d00106784d" containerName="collect-profiles" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.142238 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c3e9b0e-9caa-478c-9dd5-59d00106784d" containerName="collect-profiles" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.142437 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c3e9b0e-9caa-478c-9dd5-59d00106784d" containerName="collect-profiles" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.143067 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.147214 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.147429 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.147659 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.152553 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-8vdc6"] Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.283631 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w7w6\" (UniqueName: \"kubernetes.io/projected/a7fbddbf-b60d-44c9-adc8-53b24104e875-kube-api-access-5w7w6\") pod \"auto-csr-approver-29567176-8vdc6\" 
(UID: \"a7fbddbf-b60d-44c9-adc8-53b24104e875\") " pod="openshift-infra/auto-csr-approver-29567176-8vdc6" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.385536 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w7w6\" (UniqueName: \"kubernetes.io/projected/a7fbddbf-b60d-44c9-adc8-53b24104e875-kube-api-access-5w7w6\") pod \"auto-csr-approver-29567176-8vdc6\" (UID: \"a7fbddbf-b60d-44c9-adc8-53b24104e875\") " pod="openshift-infra/auto-csr-approver-29567176-8vdc6" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.405344 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w7w6\" (UniqueName: \"kubernetes.io/projected/a7fbddbf-b60d-44c9-adc8-53b24104e875-kube-api-access-5w7w6\") pod \"auto-csr-approver-29567176-8vdc6\" (UID: \"a7fbddbf-b60d-44c9-adc8-53b24104e875\") " pod="openshift-infra/auto-csr-approver-29567176-8vdc6" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.502801 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.964660 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-8vdc6"] Mar 20 18:16:00 crc kubenswrapper[4803]: I0320 18:16:00.979758 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:16:01 crc kubenswrapper[4803]: I0320 18:16:01.237330 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" event={"ID":"a7fbddbf-b60d-44c9-adc8-53b24104e875","Type":"ContainerStarted","Data":"61f4adbdc980c7e5e8472ff5337e0b83a634c1f62060001ddb98e4d64c7bdcf1"} Mar 20 18:16:02 crc kubenswrapper[4803]: I0320 18:16:02.246487 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" event={"ID":"a7fbddbf-b60d-44c9-adc8-53b24104e875","Type":"ContainerStarted","Data":"7da7ca8dd32d79d4ff16f10cb2c16e28be00b8f729a66a804391ed4bbc74cec7"} Mar 20 18:16:02 crc kubenswrapper[4803]: I0320 18:16:02.263163 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" podStartSLOduration=1.328197694 podStartE2EDuration="2.263142793s" podCreationTimestamp="2026-03-20 18:16:00 +0000 UTC" firstStartedPulling="2026-03-20 18:16:00.979513823 +0000 UTC m=+3570.891105883" lastFinishedPulling="2026-03-20 18:16:01.914458912 +0000 UTC m=+3571.826050982" observedRunningTime="2026-03-20 18:16:02.260013505 +0000 UTC m=+3572.171605585" watchObservedRunningTime="2026-03-20 18:16:02.263142793 +0000 UTC m=+3572.174734863" Mar 20 18:16:03 crc kubenswrapper[4803]: I0320 18:16:03.255808 4803 generic.go:334] "Generic (PLEG): container finished" podID="a7fbddbf-b60d-44c9-adc8-53b24104e875" containerID="7da7ca8dd32d79d4ff16f10cb2c16e28be00b8f729a66a804391ed4bbc74cec7" exitCode=0 Mar 20 18:16:03 crc 
kubenswrapper[4803]: I0320 18:16:03.255907 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" event={"ID":"a7fbddbf-b60d-44c9-adc8-53b24104e875","Type":"ContainerDied","Data":"7da7ca8dd32d79d4ff16f10cb2c16e28be00b8f729a66a804391ed4bbc74cec7"} Mar 20 18:16:04 crc kubenswrapper[4803]: I0320 18:16:04.639749 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" Mar 20 18:16:04 crc kubenswrapper[4803]: I0320 18:16:04.786346 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w7w6\" (UniqueName: \"kubernetes.io/projected/a7fbddbf-b60d-44c9-adc8-53b24104e875-kube-api-access-5w7w6\") pod \"a7fbddbf-b60d-44c9-adc8-53b24104e875\" (UID: \"a7fbddbf-b60d-44c9-adc8-53b24104e875\") " Mar 20 18:16:04 crc kubenswrapper[4803]: I0320 18:16:04.794037 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fbddbf-b60d-44c9-adc8-53b24104e875-kube-api-access-5w7w6" (OuterVolumeSpecName: "kube-api-access-5w7w6") pod "a7fbddbf-b60d-44c9-adc8-53b24104e875" (UID: "a7fbddbf-b60d-44c9-adc8-53b24104e875"). InnerVolumeSpecName "kube-api-access-5w7w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:16:04 crc kubenswrapper[4803]: I0320 18:16:04.889342 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w7w6\" (UniqueName: \"kubernetes.io/projected/a7fbddbf-b60d-44c9-adc8-53b24104e875-kube-api-access-5w7w6\") on node \"crc\" DevicePath \"\"" Mar 20 18:16:05 crc kubenswrapper[4803]: I0320 18:16:05.275680 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" event={"ID":"a7fbddbf-b60d-44c9-adc8-53b24104e875","Type":"ContainerDied","Data":"61f4adbdc980c7e5e8472ff5337e0b83a634c1f62060001ddb98e4d64c7bdcf1"} Mar 20 18:16:05 crc kubenswrapper[4803]: I0320 18:16:05.275728 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61f4adbdc980c7e5e8472ff5337e0b83a634c1f62060001ddb98e4d64c7bdcf1" Mar 20 18:16:05 crc kubenswrapper[4803]: I0320 18:16:05.275709 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567176-8vdc6" Mar 20 18:16:05 crc kubenswrapper[4803]: I0320 18:16:05.340341 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-7dc7n"] Mar 20 18:16:05 crc kubenswrapper[4803]: I0320 18:16:05.351650 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567170-7dc7n"] Mar 20 18:16:06 crc kubenswrapper[4803]: I0320 18:16:06.859054 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e595cb0d-75b2-4615-9658-eaefa75ce503" path="/var/lib/kubelet/pods/e595cb0d-75b2-4615-9658-eaefa75ce503/volumes" Mar 20 18:16:08 crc kubenswrapper[4803]: I0320 18:16:08.245769 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 18:16:08 crc kubenswrapper[4803]: I0320 18:16:08.246014 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:16:13 crc kubenswrapper[4803]: I0320 18:16:13.943748 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-24x6b_06c64fc0-e716-455b-bff4-0aac055505a9/kube-rbac-proxy/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.157474 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-24x6b_06c64fc0-e716-455b-bff4-0aac055505a9/controller/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.163112 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-6nwbj_3181265c-ba7e-4f29-9950-bfefd81e98e5/frr-k8s-webhook-server/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.315221 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.476292 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.507736 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.537649 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:16:14 crc 
kubenswrapper[4803]: I0320 18:16:14.537860 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.731328 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.747599 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.750548 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.777048 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.967254 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.983040 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:16:14 crc kubenswrapper[4803]: I0320 18:16:14.997413 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.005856 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/controller/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.160685 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/frr-metrics/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.202339 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/kube-rbac-proxy/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.221198 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/kube-rbac-proxy-frr/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.365692 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/reloader/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.540449 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-599d9f9c9-jbh6h_656f1985-be0a-4447-a03f-2ec4d11727c2/manager/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.675562 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66cfbc7d76-5tcqn_f939090a-abec-48ac-9e06-10175ff02c71/webhook-server/0.log" Mar 20 18:16:15 crc kubenswrapper[4803]: I0320 18:16:15.790090 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qjwcn_6b632859-081b-4be0-a3f6-9b91b4687ecf/kube-rbac-proxy/0.log" Mar 20 18:16:16 crc kubenswrapper[4803]: I0320 18:16:16.481566 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qjwcn_6b632859-081b-4be0-a3f6-9b91b4687ecf/speaker/0.log" Mar 20 18:16:16 crc kubenswrapper[4803]: I0320 18:16:16.890203 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/frr/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.014970 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/util/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.231164 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/pull/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.231205 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/pull/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.241393 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/util/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.418188 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/extract/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.425834 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/util/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.443242 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/pull/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.623484 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/util/0.log" Mar 20 
18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.856913 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/pull/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.886538 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/util/0.log" Mar 20 18:16:29 crc kubenswrapper[4803]: I0320 18:16:29.909789 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/pull/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.059841 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/pull/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.066508 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/util/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.102877 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/extract/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.251010 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-utilities/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.425360 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-utilities/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.435471 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-content/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.484232 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-content/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.642446 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-content/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.650611 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-utilities/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.827692 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-utilities/0.log" Mar 20 18:16:30 crc kubenswrapper[4803]: I0320 18:16:30.882656 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/registry-server/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.135232 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-utilities/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.151808 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-content/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.171554 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-content/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.361501 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-content/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.422944 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-utilities/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.560203 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-s92b2_617ba1c8-b42d-4e9c-8d5b-6a903f267358/marketplace-operator/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.661923 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-utilities/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.852131 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/registry-server/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.908876 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-utilities/0.log" Mar 20 18:16:31 crc kubenswrapper[4803]: I0320 18:16:31.977823 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-content/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.008795 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-content/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.252580 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-utilities/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.284485 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-content/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.453844 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/registry-server/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.488906 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-utilities/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.580740 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-utilities/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.632767 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-content/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.643800 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-content/0.log" 
Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.786365 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-content/0.log" Mar 20 18:16:32 crc kubenswrapper[4803]: I0320 18:16:32.810203 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-utilities/0.log" Mar 20 18:16:33 crc kubenswrapper[4803]: I0320 18:16:33.178515 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/registry-server/0.log" Mar 20 18:16:38 crc kubenswrapper[4803]: I0320 18:16:38.245670 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:16:38 crc kubenswrapper[4803]: I0320 18:16:38.246371 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:16:40 crc kubenswrapper[4803]: I0320 18:16:40.977760 4803 scope.go:117] "RemoveContainer" containerID="b340f385277c882a25ec48ff66272ec25d7b577e7ce8c38b5f15a671480dcc1a" Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.246382 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:17:08 crc 
kubenswrapper[4803]: I0320 18:17:08.247076 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.247136 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.248186 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.248277 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" gracePeriod=600 Mar 20 18:17:08 crc kubenswrapper[4803]: E0320 18:17:08.371619 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.858591 4803 generic.go:334] "Generic 
(PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" exitCode=0 Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.905210 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f"} Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.905308 4803 scope.go:117] "RemoveContainer" containerID="2f3cc3c4717154afbdb4abc9029591c46f46936ced43a499885d3ba9d07fcb2b" Mar 20 18:17:08 crc kubenswrapper[4803]: I0320 18:17:08.906448 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:17:08 crc kubenswrapper[4803]: E0320 18:17:08.906976 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:17:23 crc kubenswrapper[4803]: I0320 18:17:23.848175 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:17:23 crc kubenswrapper[4803]: E0320 18:17:23.849634 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:17:35 crc kubenswrapper[4803]: I0320 18:17:35.848650 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:17:35 crc kubenswrapper[4803]: E0320 18:17:35.849473 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:17:50 crc kubenswrapper[4803]: I0320 18:17:50.862077 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:17:50 crc kubenswrapper[4803]: E0320 18:17:50.862830 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.155094 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567178-8pkkx"] Mar 20 18:18:00 crc kubenswrapper[4803]: E0320 18:18:00.156546 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fbddbf-b60d-44c9-adc8-53b24104e875" containerName="oc" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.156571 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fbddbf-b60d-44c9-adc8-53b24104e875" containerName="oc" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.156975 4803 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fbddbf-b60d-44c9-adc8-53b24104e875" containerName="oc" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.158144 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-8pkkx" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.160305 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.161086 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.161111 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.166858 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-8pkkx"] Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.227003 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsd8\" (UniqueName: \"kubernetes.io/projected/dd207e9b-7b34-48bd-9f9d-5201793fa6f3-kube-api-access-svsd8\") pod \"auto-csr-approver-29567178-8pkkx\" (UID: \"dd207e9b-7b34-48bd-9f9d-5201793fa6f3\") " pod="openshift-infra/auto-csr-approver-29567178-8pkkx" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.329265 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsd8\" (UniqueName: \"kubernetes.io/projected/dd207e9b-7b34-48bd-9f9d-5201793fa6f3-kube-api-access-svsd8\") pod \"auto-csr-approver-29567178-8pkkx\" (UID: \"dd207e9b-7b34-48bd-9f9d-5201793fa6f3\") " pod="openshift-infra/auto-csr-approver-29567178-8pkkx" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.350375 4803 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-svsd8\" (UniqueName: \"kubernetes.io/projected/dd207e9b-7b34-48bd-9f9d-5201793fa6f3-kube-api-access-svsd8\") pod \"auto-csr-approver-29567178-8pkkx\" (UID: \"dd207e9b-7b34-48bd-9f9d-5201793fa6f3\") " pod="openshift-infra/auto-csr-approver-29567178-8pkkx" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.476469 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-8pkkx" Mar 20 18:18:00 crc kubenswrapper[4803]: I0320 18:18:00.949690 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-8pkkx"] Mar 20 18:18:01 crc kubenswrapper[4803]: I0320 18:18:01.433513 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-8pkkx" event={"ID":"dd207e9b-7b34-48bd-9f9d-5201793fa6f3","Type":"ContainerStarted","Data":"d1323364b236271259f64d8c697db48dcf379e1ef2f34d5e74a7357fd1fed7e0"} Mar 20 18:18:02 crc kubenswrapper[4803]: I0320 18:18:02.445208 4803 generic.go:334] "Generic (PLEG): container finished" podID="dd207e9b-7b34-48bd-9f9d-5201793fa6f3" containerID="02d1d84476808c7878973b7c431017cb412506e64972747ba7fc5e325378c7f0" exitCode=0 Mar 20 18:18:02 crc kubenswrapper[4803]: I0320 18:18:02.445425 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-8pkkx" event={"ID":"dd207e9b-7b34-48bd-9f9d-5201793fa6f3","Type":"ContainerDied","Data":"02d1d84476808c7878973b7c431017cb412506e64972747ba7fc5e325378c7f0"} Mar 20 18:18:02 crc kubenswrapper[4803]: I0320 18:18:02.849037 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:18:02 crc kubenswrapper[4803]: E0320 18:18:02.849401 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:18:03 crc kubenswrapper[4803]: I0320 18:18:03.857623 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-8pkkx" Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.007793 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svsd8\" (UniqueName: \"kubernetes.io/projected/dd207e9b-7b34-48bd-9f9d-5201793fa6f3-kube-api-access-svsd8\") pod \"dd207e9b-7b34-48bd-9f9d-5201793fa6f3\" (UID: \"dd207e9b-7b34-48bd-9f9d-5201793fa6f3\") " Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.014155 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd207e9b-7b34-48bd-9f9d-5201793fa6f3-kube-api-access-svsd8" (OuterVolumeSpecName: "kube-api-access-svsd8") pod "dd207e9b-7b34-48bd-9f9d-5201793fa6f3" (UID: "dd207e9b-7b34-48bd-9f9d-5201793fa6f3"). InnerVolumeSpecName "kube-api-access-svsd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.110310 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svsd8\" (UniqueName: \"kubernetes.io/projected/dd207e9b-7b34-48bd-9f9d-5201793fa6f3-kube-api-access-svsd8\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.465459 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567178-8pkkx" event={"ID":"dd207e9b-7b34-48bd-9f9d-5201793fa6f3","Type":"ContainerDied","Data":"d1323364b236271259f64d8c697db48dcf379e1ef2f34d5e74a7357fd1fed7e0"} Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.465510 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1323364b236271259f64d8c697db48dcf379e1ef2f34d5e74a7357fd1fed7e0" Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.465597 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567178-8pkkx" Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.931228 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-m6pk2"] Mar 20 18:18:04 crc kubenswrapper[4803]: I0320 18:18:04.940114 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567172-m6pk2"] Mar 20 18:18:06 crc kubenswrapper[4803]: I0320 18:18:06.862052 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d59f1b-69b7-4c69-98f0-58f85f1125c0" path="/var/lib/kubelet/pods/e5d59f1b-69b7-4c69-98f0-58f85f1125c0/volumes" Mar 20 18:18:14 crc kubenswrapper[4803]: I0320 18:18:14.853914 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:18:14 crc kubenswrapper[4803]: E0320 18:18:14.855273 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:18:23 crc kubenswrapper[4803]: I0320 18:18:23.634266 4803 generic.go:334] "Generic (PLEG): container finished" podID="d2557091-e342-4378-8dd3-3355ede65628" containerID="52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7" exitCode=0 Mar 20 18:18:23 crc kubenswrapper[4803]: I0320 18:18:23.634328 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" event={"ID":"d2557091-e342-4378-8dd3-3355ede65628","Type":"ContainerDied","Data":"52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7"} Mar 20 18:18:23 crc kubenswrapper[4803]: I0320 18:18:23.635328 4803 scope.go:117] "RemoveContainer" containerID="52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7" Mar 20 18:18:24 crc kubenswrapper[4803]: I0320 18:18:24.689356 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vdxlg_must-gather-gtjhq_d2557091-e342-4378-8dd3-3355ede65628/gather/0.log" Mar 20 18:18:28 crc kubenswrapper[4803]: I0320 18:18:28.848669 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:18:28 crc kubenswrapper[4803]: E0320 18:18:28.849515 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:18:31 crc kubenswrapper[4803]: I0320 18:18:31.857212 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vdxlg/must-gather-gtjhq"] Mar 20 18:18:31 crc kubenswrapper[4803]: I0320 18:18:31.857773 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" podUID="d2557091-e342-4378-8dd3-3355ede65628" containerName="copy" containerID="cri-o://e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821" gracePeriod=2 Mar 20 18:18:31 crc kubenswrapper[4803]: I0320 18:18:31.869390 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vdxlg/must-gather-gtjhq"] Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.293006 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vdxlg_must-gather-gtjhq_d2557091-e342-4378-8dd3-3355ede65628/copy/0.log" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.293621 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.386850 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt24n\" (UniqueName: \"kubernetes.io/projected/d2557091-e342-4378-8dd3-3355ede65628-kube-api-access-jt24n\") pod \"d2557091-e342-4378-8dd3-3355ede65628\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.386922 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2557091-e342-4378-8dd3-3355ede65628-must-gather-output\") pod \"d2557091-e342-4378-8dd3-3355ede65628\" (UID: \"d2557091-e342-4378-8dd3-3355ede65628\") " Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.392933 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2557091-e342-4378-8dd3-3355ede65628-kube-api-access-jt24n" (OuterVolumeSpecName: "kube-api-access-jt24n") pod "d2557091-e342-4378-8dd3-3355ede65628" (UID: "d2557091-e342-4378-8dd3-3355ede65628"). InnerVolumeSpecName "kube-api-access-jt24n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.489487 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt24n\" (UniqueName: \"kubernetes.io/projected/d2557091-e342-4378-8dd3-3355ede65628-kube-api-access-jt24n\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.534387 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2557091-e342-4378-8dd3-3355ede65628-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d2557091-e342-4378-8dd3-3355ede65628" (UID: "d2557091-e342-4378-8dd3-3355ede65628"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.591687 4803 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2557091-e342-4378-8dd3-3355ede65628-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.734659 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vdxlg_must-gather-gtjhq_d2557091-e342-4378-8dd3-3355ede65628/copy/0.log" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.735386 4803 generic.go:334] "Generic (PLEG): container finished" podID="d2557091-e342-4378-8dd3-3355ede65628" containerID="e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821" exitCode=143 Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.735446 4803 scope.go:117] "RemoveContainer" containerID="e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.735587 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vdxlg/must-gather-gtjhq" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.760175 4803 scope.go:117] "RemoveContainer" containerID="52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.833555 4803 scope.go:117] "RemoveContainer" containerID="e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821" Mar 20 18:18:32 crc kubenswrapper[4803]: E0320 18:18:32.834115 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821\": container with ID starting with e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821 not found: ID does not exist" containerID="e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.834149 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821"} err="failed to get container status \"e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821\": rpc error: code = NotFound desc = could not find container \"e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821\": container with ID starting with e834071f1838f72610c21eaede05714cd5703d0afd776bebacdb3d93a7f6d821 not found: ID does not exist" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.834170 4803 scope.go:117] "RemoveContainer" containerID="52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7" Mar 20 18:18:32 crc kubenswrapper[4803]: E0320 18:18:32.834537 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7\": container with ID starting with 
52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7 not found: ID does not exist" containerID="52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.834569 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7"} err="failed to get container status \"52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7\": rpc error: code = NotFound desc = could not find container \"52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7\": container with ID starting with 52348d4cfd355c2f1269a04c612fcb9f110ecec912c41d6207630ee8e77e65a7 not found: ID does not exist" Mar 20 18:18:32 crc kubenswrapper[4803]: I0320 18:18:32.857861 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2557091-e342-4378-8dd3-3355ede65628" path="/var/lib/kubelet/pods/d2557091-e342-4378-8dd3-3355ede65628/volumes" Mar 20 18:18:41 crc kubenswrapper[4803]: I0320 18:18:41.076905 4803 scope.go:117] "RemoveContainer" containerID="90af2e1ee7c5b490ff7c8bde587126b1d1bc681c20fae320b213c82791297e3e" Mar 20 18:18:42 crc kubenswrapper[4803]: I0320 18:18:42.848794 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:18:42 crc kubenswrapper[4803]: E0320 18:18:42.849499 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:18:55 crc kubenswrapper[4803]: I0320 18:18:55.847700 4803 scope.go:117] "RemoveContainer" 
containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:18:55 crc kubenswrapper[4803]: E0320 18:18:55.848569 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:19:08 crc kubenswrapper[4803]: I0320 18:19:08.565213 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-5598996667-w8j76" podUID="8b611951-3d24-49d2-a5bc-18d41e478610" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 20 18:19:09 crc kubenswrapper[4803]: I0320 18:19:09.848163 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:19:09 crc kubenswrapper[4803]: E0320 18:19:09.848567 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:19:23 crc kubenswrapper[4803]: I0320 18:19:23.849165 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:19:23 crc kubenswrapper[4803]: E0320 18:19:23.850225 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:19:34 crc kubenswrapper[4803]: I0320 18:19:34.847817 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:19:34 crc kubenswrapper[4803]: E0320 18:19:34.849561 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:19:49 crc kubenswrapper[4803]: I0320 18:19:49.847938 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:19:49 crc kubenswrapper[4803]: E0320 18:19:49.848701 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.178433 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567180-l5d75"] Mar 20 18:20:00 crc kubenswrapper[4803]: E0320 18:20:00.179747 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd207e9b-7b34-48bd-9f9d-5201793fa6f3" containerName="oc" Mar 20 18:20:00 crc 
kubenswrapper[4803]: I0320 18:20:00.179769 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd207e9b-7b34-48bd-9f9d-5201793fa6f3" containerName="oc" Mar 20 18:20:00 crc kubenswrapper[4803]: E0320 18:20:00.179796 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2557091-e342-4378-8dd3-3355ede65628" containerName="gather" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.179809 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2557091-e342-4378-8dd3-3355ede65628" containerName="gather" Mar 20 18:20:00 crc kubenswrapper[4803]: E0320 18:20:00.179858 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2557091-e342-4378-8dd3-3355ede65628" containerName="copy" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.179870 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2557091-e342-4378-8dd3-3355ede65628" containerName="copy" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.180167 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd207e9b-7b34-48bd-9f9d-5201793fa6f3" containerName="oc" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.180193 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2557091-e342-4378-8dd3-3355ede65628" containerName="copy" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.180243 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2557091-e342-4378-8dd3-3355ede65628" containerName="gather" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.181160 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-l5d75" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.185837 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.185978 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.185998 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.189700 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-l5d75"] Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.239735 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh987\" (UniqueName: \"kubernetes.io/projected/8217899f-3d1a-40be-871d-2315f7b6878b-kube-api-access-nh987\") pod \"auto-csr-approver-29567180-l5d75\" (UID: \"8217899f-3d1a-40be-871d-2315f7b6878b\") " pod="openshift-infra/auto-csr-approver-29567180-l5d75" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.341861 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh987\" (UniqueName: \"kubernetes.io/projected/8217899f-3d1a-40be-871d-2315f7b6878b-kube-api-access-nh987\") pod \"auto-csr-approver-29567180-l5d75\" (UID: \"8217899f-3d1a-40be-871d-2315f7b6878b\") " pod="openshift-infra/auto-csr-approver-29567180-l5d75" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.360615 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh987\" (UniqueName: \"kubernetes.io/projected/8217899f-3d1a-40be-871d-2315f7b6878b-kube-api-access-nh987\") pod \"auto-csr-approver-29567180-l5d75\" (UID: \"8217899f-3d1a-40be-871d-2315f7b6878b\") " 
pod="openshift-infra/auto-csr-approver-29567180-l5d75" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.521404 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-l5d75" Mar 20 18:20:00 crc kubenswrapper[4803]: I0320 18:20:00.974043 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-l5d75"] Mar 20 18:20:01 crc kubenswrapper[4803]: I0320 18:20:01.615043 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-l5d75" event={"ID":"8217899f-3d1a-40be-871d-2315f7b6878b","Type":"ContainerStarted","Data":"19455324a791777d00df3e9b8c6e5f910d418797d5d8ae0ceb846b5c30c2e2aa"} Mar 20 18:20:03 crc kubenswrapper[4803]: I0320 18:20:03.642104 4803 generic.go:334] "Generic (PLEG): container finished" podID="8217899f-3d1a-40be-871d-2315f7b6878b" containerID="13e0d63cc3019f0660afec642b53963b8ac60a69c3f6a66f8b77f7d421e4aad4" exitCode=0 Mar 20 18:20:03 crc kubenswrapper[4803]: I0320 18:20:03.642214 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-l5d75" event={"ID":"8217899f-3d1a-40be-871d-2315f7b6878b","Type":"ContainerDied","Data":"13e0d63cc3019f0660afec642b53963b8ac60a69c3f6a66f8b77f7d421e4aad4"} Mar 20 18:20:04 crc kubenswrapper[4803]: I0320 18:20:04.852280 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:20:04 crc kubenswrapper[4803]: E0320 18:20:04.869187 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" 
Mar 20 18:20:05 crc kubenswrapper[4803]: I0320 18:20:05.091157 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-l5d75" Mar 20 18:20:05 crc kubenswrapper[4803]: I0320 18:20:05.141258 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh987\" (UniqueName: \"kubernetes.io/projected/8217899f-3d1a-40be-871d-2315f7b6878b-kube-api-access-nh987\") pod \"8217899f-3d1a-40be-871d-2315f7b6878b\" (UID: \"8217899f-3d1a-40be-871d-2315f7b6878b\") " Mar 20 18:20:05 crc kubenswrapper[4803]: I0320 18:20:05.147792 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8217899f-3d1a-40be-871d-2315f7b6878b-kube-api-access-nh987" (OuterVolumeSpecName: "kube-api-access-nh987") pod "8217899f-3d1a-40be-871d-2315f7b6878b" (UID: "8217899f-3d1a-40be-871d-2315f7b6878b"). InnerVolumeSpecName "kube-api-access-nh987". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:20:05 crc kubenswrapper[4803]: I0320 18:20:05.242932 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh987\" (UniqueName: \"kubernetes.io/projected/8217899f-3d1a-40be-871d-2315f7b6878b-kube-api-access-nh987\") on node \"crc\" DevicePath \"\"" Mar 20 18:20:05 crc kubenswrapper[4803]: I0320 18:20:05.661435 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567180-l5d75" event={"ID":"8217899f-3d1a-40be-871d-2315f7b6878b","Type":"ContainerDied","Data":"19455324a791777d00df3e9b8c6e5f910d418797d5d8ae0ceb846b5c30c2e2aa"} Mar 20 18:20:05 crc kubenswrapper[4803]: I0320 18:20:05.661864 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19455324a791777d00df3e9b8c6e5f910d418797d5d8ae0ceb846b5c30c2e2aa" Mar 20 18:20:05 crc kubenswrapper[4803]: I0320 18:20:05.661813 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567180-l5d75" Mar 20 18:20:05 crc kubenswrapper[4803]: E0320 18:20:05.776928 4803 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8217899f_3d1a_40be_871d_2315f7b6878b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8217899f_3d1a_40be_871d_2315f7b6878b.slice/crio-19455324a791777d00df3e9b8c6e5f910d418797d5d8ae0ceb846b5c30c2e2aa\": RecentStats: unable to find data in memory cache]" Mar 20 18:20:06 crc kubenswrapper[4803]: I0320 18:20:06.168684 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-2h6sj"] Mar 20 18:20:06 crc kubenswrapper[4803]: I0320 18:20:06.178651 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567174-2h6sj"] Mar 20 18:20:06 crc kubenswrapper[4803]: I0320 18:20:06.864038 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe937aa-96cd-4b26-b569-f4962670c7de" path="/var/lib/kubelet/pods/5fe937aa-96cd-4b26-b569-f4962670c7de/volumes" Mar 20 18:20:17 crc kubenswrapper[4803]: I0320 18:20:17.848578 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:20:17 crc kubenswrapper[4803]: E0320 18:20:17.849384 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:20:30 crc kubenswrapper[4803]: I0320 18:20:30.860456 4803 
scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:20:30 crc kubenswrapper[4803]: E0320 18:20:30.861552 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:20:41 crc kubenswrapper[4803]: I0320 18:20:41.201152 4803 scope.go:117] "RemoveContainer" containerID="6a6e5bdb0913725740d8118be8e75cbe27b4a489d895a0dc72b3efbcdfecbb44" Mar 20 18:20:43 crc kubenswrapper[4803]: I0320 18:20:43.848768 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:20:43 crc kubenswrapper[4803]: E0320 18:20:43.849628 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:20:54 crc kubenswrapper[4803]: I0320 18:20:54.847867 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:20:54 crc kubenswrapper[4803]: E0320 18:20:54.848464 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:21:09 crc kubenswrapper[4803]: I0320 18:21:09.849128 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:21:09 crc kubenswrapper[4803]: E0320 18:21:09.849910 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.441081 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sclvs/must-gather-tpvnk"] Mar 20 18:21:19 crc kubenswrapper[4803]: E0320 18:21:19.454102 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8217899f-3d1a-40be-871d-2315f7b6878b" containerName="oc" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.454126 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="8217899f-3d1a-40be-871d-2315f7b6878b" containerName="oc" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.454348 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="8217899f-3d1a-40be-871d-2315f7b6878b" containerName="oc" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.455326 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.463328 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sclvs/must-gather-tpvnk"] Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.472800 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sclvs"/"openshift-service-ca.crt" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.473297 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sclvs"/"kube-root-ca.crt" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.474263 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-sclvs"/"default-dockercfg-k9fkn" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.595311 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72x5v\" (UniqueName: \"kubernetes.io/projected/11de5097-05f5-47c5-a17f-cb331c42fa58-kube-api-access-72x5v\") pod \"must-gather-tpvnk\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.595694 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11de5097-05f5-47c5-a17f-cb331c42fa58-must-gather-output\") pod \"must-gather-tpvnk\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.697338 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72x5v\" (UniqueName: \"kubernetes.io/projected/11de5097-05f5-47c5-a17f-cb331c42fa58-kube-api-access-72x5v\") pod \"must-gather-tpvnk\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " 
pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.697406 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11de5097-05f5-47c5-a17f-cb331c42fa58-must-gather-output\") pod \"must-gather-tpvnk\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.697989 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11de5097-05f5-47c5-a17f-cb331c42fa58-must-gather-output\") pod \"must-gather-tpvnk\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.718787 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72x5v\" (UniqueName: \"kubernetes.io/projected/11de5097-05f5-47c5-a17f-cb331c42fa58-kube-api-access-72x5v\") pod \"must-gather-tpvnk\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:19 crc kubenswrapper[4803]: I0320 18:21:19.845417 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:21:20 crc kubenswrapper[4803]: I0320 18:21:20.295323 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sclvs/must-gather-tpvnk"] Mar 20 18:21:20 crc kubenswrapper[4803]: I0320 18:21:20.470797 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/must-gather-tpvnk" event={"ID":"11de5097-05f5-47c5-a17f-cb331c42fa58","Type":"ContainerStarted","Data":"8cadcf84a33c119c48bd9851c4eb24e0be6e9dd965cc13a733b38a80dfb29727"} Mar 20 18:21:21 crc kubenswrapper[4803]: I0320 18:21:21.486741 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/must-gather-tpvnk" event={"ID":"11de5097-05f5-47c5-a17f-cb331c42fa58","Type":"ContainerStarted","Data":"21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb"} Mar 20 18:21:21 crc kubenswrapper[4803]: I0320 18:21:21.486987 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/must-gather-tpvnk" event={"ID":"11de5097-05f5-47c5-a17f-cb331c42fa58","Type":"ContainerStarted","Data":"398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d"} Mar 20 18:21:23 crc kubenswrapper[4803]: I0320 18:21:23.848195 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:21:23 crc kubenswrapper[4803]: E0320 18:21:23.848915 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.169278 4803 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-sclvs/must-gather-tpvnk" podStartSLOduration=6.169258229 podStartE2EDuration="6.169258229s" podCreationTimestamp="2026-03-20 18:21:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:21:21.506739971 +0000 UTC m=+3891.418332041" watchObservedRunningTime="2026-03-20 18:21:25.169258229 +0000 UTC m=+3895.080850309" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.179333 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sclvs/crc-debug-klcs2"] Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.181121 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.234367 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5zhx\" (UniqueName: \"kubernetes.io/projected/da95623b-8397-4629-aedc-59244f2a693c-kube-api-access-q5zhx\") pod \"crc-debug-klcs2\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.234517 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da95623b-8397-4629-aedc-59244f2a693c-host\") pod \"crc-debug-klcs2\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.336374 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5zhx\" (UniqueName: \"kubernetes.io/projected/da95623b-8397-4629-aedc-59244f2a693c-kube-api-access-q5zhx\") pod \"crc-debug-klcs2\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " 
pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.336564 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da95623b-8397-4629-aedc-59244f2a693c-host\") pod \"crc-debug-klcs2\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.336769 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da95623b-8397-4629-aedc-59244f2a693c-host\") pod \"crc-debug-klcs2\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.356148 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5zhx\" (UniqueName: \"kubernetes.io/projected/da95623b-8397-4629-aedc-59244f2a693c-kube-api-access-q5zhx\") pod \"crc-debug-klcs2\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:25 crc kubenswrapper[4803]: I0320 18:21:25.502461 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:21:26 crc kubenswrapper[4803]: I0320 18:21:26.557195 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/crc-debug-klcs2" event={"ID":"da95623b-8397-4629-aedc-59244f2a693c","Type":"ContainerStarted","Data":"1c2a0542c608ec433df9145639ddd273d3cdd7b87f40a7194f544159a08861d2"} Mar 20 18:21:26 crc kubenswrapper[4803]: I0320 18:21:26.559132 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/crc-debug-klcs2" event={"ID":"da95623b-8397-4629-aedc-59244f2a693c","Type":"ContainerStarted","Data":"d8f3c8b92c8c4f41d670761b2e9c4bd6a1ba32d5563b5561dd6f5508595e3ae7"} Mar 20 18:21:26 crc kubenswrapper[4803]: I0320 18:21:26.580700 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sclvs/crc-debug-klcs2" podStartSLOduration=1.580682632 podStartE2EDuration="1.580682632s" podCreationTimestamp="2026-03-20 18:21:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 18:21:26.574417856 +0000 UTC m=+3896.486009946" watchObservedRunningTime="2026-03-20 18:21:26.580682632 +0000 UTC m=+3896.492274702" Mar 20 18:21:36 crc kubenswrapper[4803]: I0320 18:21:36.848835 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:21:36 crc kubenswrapper[4803]: E0320 18:21:36.849505 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:21:49 crc 
kubenswrapper[4803]: I0320 18:21:49.849044 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:21:49 crc kubenswrapper[4803]: E0320 18:21:49.849781 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.156033 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567182-52597"] Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.158050 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-52597" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.160388 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.161165 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.161638 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.183046 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-52597"] Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.274410 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5c7\" (UniqueName: 
\"kubernetes.io/projected/ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64-kube-api-access-nq5c7\") pod \"auto-csr-approver-29567182-52597\" (UID: \"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64\") " pod="openshift-infra/auto-csr-approver-29567182-52597" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.376409 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5c7\" (UniqueName: \"kubernetes.io/projected/ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64-kube-api-access-nq5c7\") pod \"auto-csr-approver-29567182-52597\" (UID: \"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64\") " pod="openshift-infra/auto-csr-approver-29567182-52597" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.398211 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5c7\" (UniqueName: \"kubernetes.io/projected/ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64-kube-api-access-nq5c7\") pod \"auto-csr-approver-29567182-52597\" (UID: \"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64\") " pod="openshift-infra/auto-csr-approver-29567182-52597" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.489090 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-52597" Mar 20 18:22:00 crc kubenswrapper[4803]: I0320 18:22:00.859187 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:22:00 crc kubenswrapper[4803]: E0320 18:22:00.859464 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:22:01 crc kubenswrapper[4803]: I0320 18:22:01.073313 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-52597"] Mar 20 18:22:01 crc kubenswrapper[4803]: I0320 18:22:01.079408 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:22:01 crc kubenswrapper[4803]: I0320 18:22:01.839344 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-52597" event={"ID":"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64","Type":"ContainerStarted","Data":"3c44455996631f80ee4527646edc8c5f426cf9f032dcc3f7b1e6d25647311627"} Mar 20 18:22:02 crc kubenswrapper[4803]: I0320 18:22:02.858947 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-52597" event={"ID":"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64","Type":"ContainerStarted","Data":"e241cc7525607ecf1217ca5ce1cb4dac8ab78cf8efc4a9e899746cb2fdbfdabd"} Mar 20 18:22:02 crc kubenswrapper[4803]: I0320 18:22:02.876223 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29567182-52597" podStartSLOduration=1.713385382 podStartE2EDuration="2.876203203s" 
podCreationTimestamp="2026-03-20 18:22:00 +0000 UTC" firstStartedPulling="2026-03-20 18:22:01.078868718 +0000 UTC m=+3930.990460788" lastFinishedPulling="2026-03-20 18:22:02.241686539 +0000 UTC m=+3932.153278609" observedRunningTime="2026-03-20 18:22:02.867689604 +0000 UTC m=+3932.779281694" watchObservedRunningTime="2026-03-20 18:22:02.876203203 +0000 UTC m=+3932.787795273" Mar 20 18:22:03 crc kubenswrapper[4803]: I0320 18:22:03.887455 4803 generic.go:334] "Generic (PLEG): container finished" podID="ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64" containerID="e241cc7525607ecf1217ca5ce1cb4dac8ab78cf8efc4a9e899746cb2fdbfdabd" exitCode=0 Mar 20 18:22:03 crc kubenswrapper[4803]: I0320 18:22:03.888145 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-52597" event={"ID":"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64","Type":"ContainerDied","Data":"e241cc7525607ecf1217ca5ce1cb4dac8ab78cf8efc4a9e899746cb2fdbfdabd"} Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.377279 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-52597" Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.490757 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5c7\" (UniqueName: \"kubernetes.io/projected/ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64-kube-api-access-nq5c7\") pod \"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64\" (UID: \"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64\") " Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.496999 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64-kube-api-access-nq5c7" (OuterVolumeSpecName: "kube-api-access-nq5c7") pod "ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64" (UID: "ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64"). InnerVolumeSpecName "kube-api-access-nq5c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.592860 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq5c7\" (UniqueName: \"kubernetes.io/projected/ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64-kube-api-access-nq5c7\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.905437 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567182-52597" event={"ID":"ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64","Type":"ContainerDied","Data":"3c44455996631f80ee4527646edc8c5f426cf9f032dcc3f7b1e6d25647311627"} Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.906096 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c44455996631f80ee4527646edc8c5f426cf9f032dcc3f7b1e6d25647311627" Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.905451 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567182-52597" Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.907400 4803 generic.go:334] "Generic (PLEG): container finished" podID="da95623b-8397-4629-aedc-59244f2a693c" containerID="1c2a0542c608ec433df9145639ddd273d3cdd7b87f40a7194f544159a08861d2" exitCode=0 Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.907437 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/crc-debug-klcs2" event={"ID":"da95623b-8397-4629-aedc-59244f2a693c","Type":"ContainerDied","Data":"1c2a0542c608ec433df9145639ddd273d3cdd7b87f40a7194f544159a08861d2"} Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.956787 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-8vdc6"] Mar 20 18:22:05 crc kubenswrapper[4803]: I0320 18:22:05.966921 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567176-8vdc6"] 
Mar 20 18:22:06 crc kubenswrapper[4803]: I0320 18:22:06.860659 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fbddbf-b60d-44c9-adc8-53b24104e875" path="/var/lib/kubelet/pods/a7fbddbf-b60d-44c9-adc8-53b24104e875/volumes" Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.073630 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.117614 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sclvs/crc-debug-klcs2"] Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.128488 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sclvs/crc-debug-klcs2"] Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.271577 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da95623b-8397-4629-aedc-59244f2a693c-host\") pod \"da95623b-8397-4629-aedc-59244f2a693c\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.271922 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5zhx\" (UniqueName: \"kubernetes.io/projected/da95623b-8397-4629-aedc-59244f2a693c-kube-api-access-q5zhx\") pod \"da95623b-8397-4629-aedc-59244f2a693c\" (UID: \"da95623b-8397-4629-aedc-59244f2a693c\") " Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.271735 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da95623b-8397-4629-aedc-59244f2a693c-host" (OuterVolumeSpecName: "host") pod "da95623b-8397-4629-aedc-59244f2a693c" (UID: "da95623b-8397-4629-aedc-59244f2a693c"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.272459 4803 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/da95623b-8397-4629-aedc-59244f2a693c-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.278077 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da95623b-8397-4629-aedc-59244f2a693c-kube-api-access-q5zhx" (OuterVolumeSpecName: "kube-api-access-q5zhx") pod "da95623b-8397-4629-aedc-59244f2a693c" (UID: "da95623b-8397-4629-aedc-59244f2a693c"). InnerVolumeSpecName "kube-api-access-q5zhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.374021 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5zhx\" (UniqueName: \"kubernetes.io/projected/da95623b-8397-4629-aedc-59244f2a693c-kube-api-access-q5zhx\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.924976 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8f3c8b92c8c4f41d670761b2e9c4bd6a1ba32d5563b5561dd6f5508595e3ae7" Mar 20 18:22:07 crc kubenswrapper[4803]: I0320 18:22:07.925070 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-klcs2" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.339708 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sclvs/crc-debug-nqg44"] Mar 20 18:22:08 crc kubenswrapper[4803]: E0320 18:22:08.340090 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64" containerName="oc" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.340110 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64" containerName="oc" Mar 20 18:22:08 crc kubenswrapper[4803]: E0320 18:22:08.340135 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da95623b-8397-4629-aedc-59244f2a693c" containerName="container-00" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.340142 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="da95623b-8397-4629-aedc-59244f2a693c" containerName="container-00" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.340328 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64" containerName="oc" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.340342 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="da95623b-8397-4629-aedc-59244f2a693c" containerName="container-00" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.341129 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.494358 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcg5l\" (UniqueName: \"kubernetes.io/projected/5ce4da1c-99d2-4481-8879-1b310493c60a-kube-api-access-mcg5l\") pod \"crc-debug-nqg44\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.494543 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ce4da1c-99d2-4481-8879-1b310493c60a-host\") pod \"crc-debug-nqg44\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.595746 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ce4da1c-99d2-4481-8879-1b310493c60a-host\") pod \"crc-debug-nqg44\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.596096 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcg5l\" (UniqueName: \"kubernetes.io/projected/5ce4da1c-99d2-4481-8879-1b310493c60a-kube-api-access-mcg5l\") pod \"crc-debug-nqg44\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.595893 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ce4da1c-99d2-4481-8879-1b310493c60a-host\") pod \"crc-debug-nqg44\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc 
kubenswrapper[4803]: I0320 18:22:08.610734 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcg5l\" (UniqueName: \"kubernetes.io/projected/5ce4da1c-99d2-4481-8879-1b310493c60a-kube-api-access-mcg5l\") pod \"crc-debug-nqg44\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.656330 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.858487 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da95623b-8397-4629-aedc-59244f2a693c" path="/var/lib/kubelet/pods/da95623b-8397-4629-aedc-59244f2a693c/volumes" Mar 20 18:22:08 crc kubenswrapper[4803]: I0320 18:22:08.936926 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/crc-debug-nqg44" event={"ID":"5ce4da1c-99d2-4481-8879-1b310493c60a","Type":"ContainerStarted","Data":"dadc2fa6e1d48e99357ed6c86161d38343d1404749bcb6a4895e71f79add77a3"} Mar 20 18:22:09 crc kubenswrapper[4803]: I0320 18:22:09.945586 4803 generic.go:334] "Generic (PLEG): container finished" podID="5ce4da1c-99d2-4481-8879-1b310493c60a" containerID="f19fc7a53eba2c4eefc258478c25aa87328ee74bbc3137448d2f329c8c32fe13" exitCode=0 Mar 20 18:22:09 crc kubenswrapper[4803]: I0320 18:22:09.945882 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/crc-debug-nqg44" event={"ID":"5ce4da1c-99d2-4481-8879-1b310493c60a","Type":"ContainerDied","Data":"f19fc7a53eba2c4eefc258478c25aa87328ee74bbc3137448d2f329c8c32fe13"} Mar 20 18:22:10 crc kubenswrapper[4803]: I0320 18:22:10.356135 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sclvs/crc-debug-nqg44"] Mar 20 18:22:10 crc kubenswrapper[4803]: I0320 18:22:10.365298 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-sclvs/crc-debug-nqg44"] Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.065657 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.143751 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcg5l\" (UniqueName: \"kubernetes.io/projected/5ce4da1c-99d2-4481-8879-1b310493c60a-kube-api-access-mcg5l\") pod \"5ce4da1c-99d2-4481-8879-1b310493c60a\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.144882 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ce4da1c-99d2-4481-8879-1b310493c60a-host\") pod \"5ce4da1c-99d2-4481-8879-1b310493c60a\" (UID: \"5ce4da1c-99d2-4481-8879-1b310493c60a\") " Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.145004 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ce4da1c-99d2-4481-8879-1b310493c60a-host" (OuterVolumeSpecName: "host") pod "5ce4da1c-99d2-4481-8879-1b310493c60a" (UID: "5ce4da1c-99d2-4481-8879-1b310493c60a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.145586 4803 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ce4da1c-99d2-4481-8879-1b310493c60a-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.150843 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce4da1c-99d2-4481-8879-1b310493c60a-kube-api-access-mcg5l" (OuterVolumeSpecName: "kube-api-access-mcg5l") pod "5ce4da1c-99d2-4481-8879-1b310493c60a" (UID: "5ce4da1c-99d2-4481-8879-1b310493c60a"). 
InnerVolumeSpecName "kube-api-access-mcg5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.247255 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcg5l\" (UniqueName: \"kubernetes.io/projected/5ce4da1c-99d2-4481-8879-1b310493c60a-kube-api-access-mcg5l\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.817806 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sclvs/crc-debug-wdvr2"] Mar 20 18:22:11 crc kubenswrapper[4803]: E0320 18:22:11.818553 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce4da1c-99d2-4481-8879-1b310493c60a" containerName="container-00" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.818572 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce4da1c-99d2-4481-8879-1b310493c60a" containerName="container-00" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.818779 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce4da1c-99d2-4481-8879-1b310493c60a" containerName="container-00" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.819441 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.847914 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.956337 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-host\") pod \"crc-debug-wdvr2\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.956385 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8nqz\" (UniqueName: \"kubernetes.io/projected/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-kube-api-access-f8nqz\") pod \"crc-debug-wdvr2\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.976031 4803 scope.go:117] "RemoveContainer" containerID="f19fc7a53eba2c4eefc258478c25aa87328ee74bbc3137448d2f329c8c32fe13" Mar 20 18:22:11 crc kubenswrapper[4803]: I0320 18:22:11.976185 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-nqg44" Mar 20 18:22:12 crc kubenswrapper[4803]: I0320 18:22:12.058976 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-host\") pod \"crc-debug-wdvr2\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:12 crc kubenswrapper[4803]: I0320 18:22:12.059114 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8nqz\" (UniqueName: \"kubernetes.io/projected/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-kube-api-access-f8nqz\") pod \"crc-debug-wdvr2\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:12 crc kubenswrapper[4803]: I0320 18:22:12.059127 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-host\") pod \"crc-debug-wdvr2\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:12 crc kubenswrapper[4803]: I0320 18:22:12.077956 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8nqz\" (UniqueName: \"kubernetes.io/projected/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-kube-api-access-f8nqz\") pod \"crc-debug-wdvr2\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:12 crc kubenswrapper[4803]: I0320 18:22:12.136663 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:12 crc kubenswrapper[4803]: W0320 18:22:12.166831 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc259c770_f3c7_4eab_a7e5_3f3a74f923fa.slice/crio-96dea21e17d1b6a41fc32e8b9eda438caa8eb3ece553078887d3362f5a497f6d WatchSource:0}: Error finding container 96dea21e17d1b6a41fc32e8b9eda438caa8eb3ece553078887d3362f5a497f6d: Status 404 returned error can't find the container with id 96dea21e17d1b6a41fc32e8b9eda438caa8eb3ece553078887d3362f5a497f6d Mar 20 18:22:12 crc kubenswrapper[4803]: I0320 18:22:12.871464 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ce4da1c-99d2-4481-8879-1b310493c60a" path="/var/lib/kubelet/pods/5ce4da1c-99d2-4481-8879-1b310493c60a/volumes" Mar 20 18:22:12 crc kubenswrapper[4803]: I0320 18:22:12.996280 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"2fdd16f5246edd6c5e9048a1a65ae17cc95dfdb32f5019a55bd0a73d20083db4"} Mar 20 18:22:13 crc kubenswrapper[4803]: I0320 18:22:13.010751 4803 generic.go:334] "Generic (PLEG): container finished" podID="c259c770-f3c7-4eab-a7e5-3f3a74f923fa" containerID="c4251d89bd0aa5a20b21d562817ada1495c2b96ab7a7702baa2553ca56be9ae3" exitCode=0 Mar 20 18:22:13 crc kubenswrapper[4803]: I0320 18:22:13.010909 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/crc-debug-wdvr2" event={"ID":"c259c770-f3c7-4eab-a7e5-3f3a74f923fa","Type":"ContainerDied","Data":"c4251d89bd0aa5a20b21d562817ada1495c2b96ab7a7702baa2553ca56be9ae3"} Mar 20 18:22:13 crc kubenswrapper[4803]: I0320 18:22:13.010960 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/crc-debug-wdvr2" 
event={"ID":"c259c770-f3c7-4eab-a7e5-3f3a74f923fa","Type":"ContainerStarted","Data":"96dea21e17d1b6a41fc32e8b9eda438caa8eb3ece553078887d3362f5a497f6d"} Mar 20 18:22:13 crc kubenswrapper[4803]: I0320 18:22:13.078997 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sclvs/crc-debug-wdvr2"] Mar 20 18:22:13 crc kubenswrapper[4803]: I0320 18:22:13.091295 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sclvs/crc-debug-wdvr2"] Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.138942 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.306090 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-host\") pod \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.306238 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-host" (OuterVolumeSpecName: "host") pod "c259c770-f3c7-4eab-a7e5-3f3a74f923fa" (UID: "c259c770-f3c7-4eab-a7e5-3f3a74f923fa"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.306290 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8nqz\" (UniqueName: \"kubernetes.io/projected/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-kube-api-access-f8nqz\") pod \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\" (UID: \"c259c770-f3c7-4eab-a7e5-3f3a74f923fa\") " Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.306797 4803 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-host\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.312780 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-kube-api-access-f8nqz" (OuterVolumeSpecName: "kube-api-access-f8nqz") pod "c259c770-f3c7-4eab-a7e5-3f3a74f923fa" (UID: "c259c770-f3c7-4eab-a7e5-3f3a74f923fa"). InnerVolumeSpecName "kube-api-access-f8nqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.409812 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8nqz\" (UniqueName: \"kubernetes.io/projected/c259c770-f3c7-4eab-a7e5-3f3a74f923fa-kube-api-access-f8nqz\") on node \"crc\" DevicePath \"\"" Mar 20 18:22:14 crc kubenswrapper[4803]: I0320 18:22:14.863270 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c259c770-f3c7-4eab-a7e5-3f3a74f923fa" path="/var/lib/kubelet/pods/c259c770-f3c7-4eab-a7e5-3f3a74f923fa/volumes" Mar 20 18:22:15 crc kubenswrapper[4803]: I0320 18:22:15.055795 4803 scope.go:117] "RemoveContainer" containerID="c4251d89bd0aa5a20b21d562817ada1495c2b96ab7a7702baa2553ca56be9ae3" Mar 20 18:22:15 crc kubenswrapper[4803]: I0320 18:22:15.055906 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/crc-debug-wdvr2" Mar 20 18:22:37 crc kubenswrapper[4803]: I0320 18:22:37.838602 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648ff8c756-h8wjt_deeb9bec-8bcb-48ef-ba67-b5772825f753/barbican-api/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.070586 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-648ff8c756-h8wjt_deeb9bec-8bcb-48ef-ba67-b5772825f753/barbican-api-log/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.151004 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b99997f8b-mzkpj_726567e8-cdfd-4fe4-985f-1cf10a787994/barbican-keystone-listener/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.152792 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b99997f8b-mzkpj_726567e8-cdfd-4fe4-985f-1cf10a787994/barbican-keystone-listener-log/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.335721 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bfd9f5f6f-xmdsr_43090279-19af-4393-ab9a-1092aae61875/barbican-worker-log/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.342603 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-6bfd9f5f6f-xmdsr_43090279-19af-4393-ab9a-1092aae61875/barbican-worker/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.600365 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-m9fhn_0c2a599d-17eb-4116-8b3e-a9adc8a7568b/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.646330 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/ceilometer-central-agent/0.log" 
Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.723974 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/ceilometer-notification-agent/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.809577 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/proxy-httpd/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.832962 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0a7417bc-8901-4f01-ae88-5b304c7371a9/sg-core/0.log" Mar 20 18:22:38 crc kubenswrapper[4803]: I0320 18:22:38.980473 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_035c521a-8bdf-4489-a429-3629df54ca84/cinder-api/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.042232 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_035c521a-8bdf-4489-a429-3629df54ca84/cinder-api-log/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.160981 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_78208796-d6b5-472e-9f4b-0f582d5bcfc9/cinder-scheduler/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.274898 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_78208796-d6b5-472e-9f4b-0f582d5bcfc9/probe/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.401352 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-c2xb5_04e7436c-88e7-4d1b-b078-cbee6adf422d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.611782 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5ztvh_7bb67428-717c-47fc-9ab1-b94a5c502298/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.638569 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-hh5vh_6516522c-430f-476f-8471-d5b39263571f/init/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.883384 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-hh5vh_6516522c-430f-476f-8471-d5b39263571f/dnsmasq-dns/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.930069 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-hh5vh_6516522c-430f-476f-8471-d5b39263571f/init/0.log" Mar 20 18:22:39 crc kubenswrapper[4803]: I0320 18:22:39.947796 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gjrjn_c2c45e4b-27e5-4614-875f-838444583617/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:40 crc kubenswrapper[4803]: I0320 18:22:40.213997 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_de8af7e5-2a44-4caa-883c-7ef2027e69c4/glance-httpd/0.log" Mar 20 18:22:40 crc kubenswrapper[4803]: I0320 18:22:40.266956 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_de8af7e5-2a44-4caa-883c-7ef2027e69c4/glance-log/0.log" Mar 20 18:22:40 crc kubenswrapper[4803]: I0320 18:22:40.445466 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_27ce9fd0-5867-4d7f-aec4-70784b898289/glance-log/0.log" Mar 20 18:22:40 crc kubenswrapper[4803]: I0320 18:22:40.451249 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_27ce9fd0-5867-4d7f-aec4-70784b898289/glance-httpd/0.log" Mar 20 18:22:40 crc kubenswrapper[4803]: I0320 18:22:40.631364 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-596cfc5b56-w5pbk_76a924c8-a380-4ee2-a6ce-ac77f0979f24/horizon/0.log" Mar 20 18:22:40 crc kubenswrapper[4803]: I0320 18:22:40.709370 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-fh4nf_11eacc86-5400-4614-bfa1-353a4c9a4ef8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:41 crc kubenswrapper[4803]: I0320 18:22:41.127640 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-596cfc5b56-w5pbk_76a924c8-a380-4ee2-a6ce-ac77f0979f24/horizon-log/0.log" Mar 20 18:22:41 crc kubenswrapper[4803]: I0320 18:22:41.212501 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-26xq7_de5b2da1-3a72-4b62-98a5-71352eb71c90/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:41 crc kubenswrapper[4803]: I0320 18:22:41.316889 4803 scope.go:117] "RemoveContainer" containerID="7da7ca8dd32d79d4ff16f10cb2c16e28be00b8f729a66a804391ed4bbc74cec7" Mar 20 18:22:41 crc kubenswrapper[4803]: I0320 18:22:41.323119 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29567161-685pt_5b706f4a-4386-4911-a201-5cbf5e7bd916/keystone-cron/0.log" Mar 20 18:22:41 crc kubenswrapper[4803]: I0320 18:22:41.400129 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-85847589bb-9pbbf_42bf225e-cdca-4bc8-922d-8ff2bcb6ff17/keystone-api/0.log" Mar 20 18:22:41 crc kubenswrapper[4803]: I0320 18:22:41.492840 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e13d6de9-6ef6-4194-98da-c8fee814f4d1/kube-state-metrics/0.log" Mar 20 18:22:41 crc 
kubenswrapper[4803]: I0320 18:22:41.979795 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b69588b57-phch4_9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6/neutron-api/0.log" Mar 20 18:22:42 crc kubenswrapper[4803]: I0320 18:22:42.056743 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6b69588b57-phch4_9a750f8a-dfa6-41fa-91c0-8d5c4eb8feb6/neutron-httpd/0.log" Mar 20 18:22:42 crc kubenswrapper[4803]: I0320 18:22:42.175185 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-j4qdz_f644be5e-0ec5-499a-a42d-b4381159e310/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:42 crc kubenswrapper[4803]: I0320 18:22:42.215468 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mpgwc_6aabf807-6d9b-4e6a-99db-ddfda9979b25/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:42 crc kubenswrapper[4803]: I0320 18:22:42.696549 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b56a359b-06fc-40e2-b128-c2427461160a/nova-api-log/0.log" Mar 20 18:22:42 crc kubenswrapper[4803]: I0320 18:22:42.698055 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_ad2e7092-e2b5-4f0f-bde6-892cb3660837/nova-cell0-conductor-conductor/0.log" Mar 20 18:22:43 crc kubenswrapper[4803]: I0320 18:22:43.083451 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_76a50897-11fb-467d-8b45-dde2ad95b1fd/nova-cell1-conductor-conductor/0.log" Mar 20 18:22:43 crc kubenswrapper[4803]: I0320 18:22:43.090422 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_62cc6e29-7f2d-4c3d-b32c-4b906520eded/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 18:22:43 crc kubenswrapper[4803]: I0320 18:22:43.124047 4803 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_nova-api-0_b56a359b-06fc-40e2-b128-c2427461160a/nova-api-api/0.log" Mar 20 18:22:43 crc kubenswrapper[4803]: I0320 18:22:43.457921 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9461d82d-3e47-4dea-889c-a82c7f4a97b4/nova-metadata-log/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.016185 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-9nrx2_0b06d67e-e61d-499b-bce7-41b3f0b3b509/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.029191 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9461d82d-3e47-4dea-889c-a82c7f4a97b4/nova-metadata-metadata/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.134115 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_51598f47-53eb-4d5d-917d-3655d7e200e8/nova-scheduler-scheduler/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.204478 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f1b71013-6f7a-4559-a3bc-af90c284cada/mysql-bootstrap/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.423095 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f1b71013-6f7a-4559-a3bc-af90c284cada/galera/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.459402 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_be331baf-1bef-41ab-ac10-b8686ecb5a30/mysql-bootstrap/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.597672 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f1b71013-6f7a-4559-a3bc-af90c284cada/mysql-bootstrap/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.643719 4803 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-galera-0_be331baf-1bef-41ab-ac10-b8686ecb5a30/mysql-bootstrap/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.702543 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_be331baf-1bef-41ab-ac10-b8686ecb5a30/galera/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.831694 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_d15f9821-6db3-4c08-b731-d0c7349b4076/openstackclient/0.log" Mar 20 18:22:44 crc kubenswrapper[4803]: I0320 18:22:44.997297 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-56z85_1d21c995-a420-4ac7-8cd9-c186be9e4ba0/ovn-controller/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.085112 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-kgbvn_7cc9f89e-5d0f-4f92-93fa-c7a0133baf05/openstack-network-exporter/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.219093 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovsdb-server-init/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.427893 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovsdb-server-init/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.432949 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovsdb-server/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.443966 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l7cc6_d0bcca9a-5da9-4ffc-897f-e3a0ae093324/ovs-vswitchd/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.625611 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_02135c00-5c50-45c9-a206-85f9e60d9c6e/openstack-network-exporter/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.724966 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_02135c00-5c50-45c9-a206-85f9e60d9c6e/ovn-northd/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.726370 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-knxqc_6bd23d21-aeac-4394-b83d-befbd825d5ce/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:45 crc kubenswrapper[4803]: I0320 18:22:45.872006 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ab4e6fd1-195c-4ff0-8288-14454e4ea4f1/openstack-network-exporter/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.031807 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ab4e6fd1-195c-4ff0-8288-14454e4ea4f1/ovsdbserver-nb/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.047940 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c9af9eb-0645-4fe0-a558-6ae86595685e/openstack-network-exporter/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.149161 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_7c9af9eb-0645-4fe0-a558-6ae86595685e/ovsdbserver-sb/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.334209 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5984b77b84-4dhqv_d3ebc450-0e00-492b-a5ab-f02c63aa4071/placement-api/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.362840 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5984b77b84-4dhqv_d3ebc450-0e00-492b-a5ab-f02c63aa4071/placement-log/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.435127 4803 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6450d307-a4cf-4d3c-acdd-31a50aec6109/setup-container/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.655640 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6450d307-a4cf-4d3c-acdd-31a50aec6109/rabbitmq/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.663587 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6450d307-a4cf-4d3c-acdd-31a50aec6109/setup-container/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.713715 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72227685-667c-47b8-aedb-0329dd683bc0/setup-container/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.945809 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72227685-667c-47b8-aedb-0329dd683bc0/setup-container/0.log" Mar 20 18:22:46 crc kubenswrapper[4803]: I0320 18:22:46.951557 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72227685-667c-47b8-aedb-0329dd683bc0/rabbitmq/0.log" Mar 20 18:22:47 crc kubenswrapper[4803]: I0320 18:22:47.017052 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-pjckp_6091d3e2-8164-4c5c-b5b9-9299dbe203d5/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:47 crc kubenswrapper[4803]: I0320 18:22:47.256091 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gxnv5_be466e4c-5034-4784-9c39-a390b28adb2e/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:47 crc kubenswrapper[4803]: I0320 18:22:47.271684 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-qtns4_5daa4111-ac59-4390-b425-d56ac571c768/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:47 crc kubenswrapper[4803]: I0320 18:22:47.600611 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gfh5h_2aed2dd6-8792-412c-972f-6e7e4e4bae0e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:47 crc kubenswrapper[4803]: I0320 18:22:47.649760 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-54x78_8a143d0c-3c8c-4426-8b98-1309a281aaf8/ssh-known-hosts-edpm-deployment/0.log" Mar 20 18:22:47 crc kubenswrapper[4803]: I0320 18:22:47.888779 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5598996667-w8j76_8b611951-3d24-49d2-a5bc-18d41e478610/proxy-server/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.011247 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5598996667-w8j76_8b611951-3d24-49d2-a5bc-18d41e478610/proxy-httpd/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.172124 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-2d9wv_e846bdbc-0d6d-4dc2-9a0b-d188913b5eda/swift-ring-rebalance/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.234597 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-auditor/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.299499 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-reaper/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.376285 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-replicator/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.386459 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/account-server/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.438001 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-auditor/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.534444 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-replicator/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.580830 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-server/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.600104 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/container-updater/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.696118 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-auditor/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.743401 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-expirer/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.795949 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-server/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.829456 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-replicator/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.904734 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/object-updater/0.log" Mar 20 18:22:48 crc kubenswrapper[4803]: I0320 18:22:48.917886 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/rsync/0.log" Mar 20 18:22:49 crc kubenswrapper[4803]: I0320 18:22:49.016638 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8d6ece75-fa3d-4695-ada9-6c5ec4b580a7/swift-recon-cron/0.log" Mar 20 18:22:49 crc kubenswrapper[4803]: I0320 18:22:49.266697 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_6ba5f719-6967-43a1-b544-c27baf20c15b/tempest-tests-tempest-tests-runner/0.log" Mar 20 18:22:49 crc kubenswrapper[4803]: I0320 18:22:49.377953 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6dbe4bc0-ac8a-4ff5-a985-4e1010782ce6/test-operator-logs-container/0.log" Mar 20 18:22:49 crc kubenswrapper[4803]: I0320 18:22:49.549373 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-6hx8w_149f9011-aff3-4d4a-ac40-d1325e1bdad0/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:49 crc kubenswrapper[4803]: I0320 18:22:49.554730 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kq6rr_64a2f129-704a-4165-88ee-2cab50cc59d9/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 18:22:59 crc kubenswrapper[4803]: I0320 18:22:59.888751 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_cd627740-5358-468a-bb90-21d52992a407/memcached/0.log" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.236234 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8t6bf"] Mar 20 18:23:02 crc kubenswrapper[4803]: E0320 18:23:02.238110 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c259c770-f3c7-4eab-a7e5-3f3a74f923fa" containerName="container-00" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.238129 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="c259c770-f3c7-4eab-a7e5-3f3a74f923fa" containerName="container-00" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.238328 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="c259c770-f3c7-4eab-a7e5-3f3a74f923fa" containerName="container-00" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.239787 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.251217 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8t6bf"] Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.350244 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-utilities\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.350314 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-catalog-content\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " 
pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.350342 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6hz4\" (UniqueName: \"kubernetes.io/projected/4c0383c9-c487-4fec-80b1-a10258f91a9c-kube-api-access-v6hz4\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.452416 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-utilities\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.452481 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-catalog-content\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.452509 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6hz4\" (UniqueName: \"kubernetes.io/projected/4c0383c9-c487-4fec-80b1-a10258f91a9c-kube-api-access-v6hz4\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.452985 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-utilities\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " 
pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.453051 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-catalog-content\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.474637 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6hz4\" (UniqueName: \"kubernetes.io/projected/4c0383c9-c487-4fec-80b1-a10258f91a9c-kube-api-access-v6hz4\") pod \"redhat-operators-8t6bf\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:02 crc kubenswrapper[4803]: I0320 18:23:02.562312 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:03 crc kubenswrapper[4803]: I0320 18:23:03.037957 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8t6bf"] Mar 20 18:23:03 crc kubenswrapper[4803]: I0320 18:23:03.466251 4803 generic.go:334] "Generic (PLEG): container finished" podID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerID="6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7" exitCode=0 Mar 20 18:23:03 crc kubenswrapper[4803]: I0320 18:23:03.466336 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t6bf" event={"ID":"4c0383c9-c487-4fec-80b1-a10258f91a9c","Type":"ContainerDied","Data":"6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7"} Mar 20 18:23:03 crc kubenswrapper[4803]: I0320 18:23:03.467442 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t6bf" 
event={"ID":"4c0383c9-c487-4fec-80b1-a10258f91a9c","Type":"ContainerStarted","Data":"8fef4ef90ec6137c7acee85b78ff1609c6077ebac158c398c521742e888f830c"} Mar 20 18:23:04 crc kubenswrapper[4803]: I0320 18:23:04.477607 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t6bf" event={"ID":"4c0383c9-c487-4fec-80b1-a10258f91a9c","Type":"ContainerStarted","Data":"6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542"} Mar 20 18:23:06 crc kubenswrapper[4803]: I0320 18:23:06.496412 4803 generic.go:334] "Generic (PLEG): container finished" podID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerID="6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542" exitCode=0 Mar 20 18:23:06 crc kubenswrapper[4803]: I0320 18:23:06.496465 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t6bf" event={"ID":"4c0383c9-c487-4fec-80b1-a10258f91a9c","Type":"ContainerDied","Data":"6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542"} Mar 20 18:23:07 crc kubenswrapper[4803]: I0320 18:23:07.508662 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t6bf" event={"ID":"4c0383c9-c487-4fec-80b1-a10258f91a9c","Type":"ContainerStarted","Data":"069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b"} Mar 20 18:23:07 crc kubenswrapper[4803]: I0320 18:23:07.533727 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8t6bf" podStartSLOduration=2.038276557 podStartE2EDuration="5.533706016s" podCreationTimestamp="2026-03-20 18:23:02 +0000 UTC" firstStartedPulling="2026-03-20 18:23:03.46803753 +0000 UTC m=+3993.379629600" lastFinishedPulling="2026-03-20 18:23:06.963466989 +0000 UTC m=+3996.875059059" observedRunningTime="2026-03-20 18:23:07.526957586 +0000 UTC m=+3997.438549696" watchObservedRunningTime="2026-03-20 18:23:07.533706016 +0000 UTC m=+3997.445298096" 
Mar 20 18:23:12 crc kubenswrapper[4803]: I0320 18:23:12.563371 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:12 crc kubenswrapper[4803]: I0320 18:23:12.564207 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:13 crc kubenswrapper[4803]: I0320 18:23:13.616866 4803 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8t6bf" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="registry-server" probeResult="failure" output=< Mar 20 18:23:13 crc kubenswrapper[4803]: timeout: failed to connect service ":50051" within 1s Mar 20 18:23:13 crc kubenswrapper[4803]: > Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.689415 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gtd94"] Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.694038 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.699006 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtd94"] Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.849628 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtbvm\" (UniqueName: \"kubernetes.io/projected/fdd8c6e2-09fd-4e27-b428-8d80976baefa-kube-api-access-rtbvm\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.849743 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-catalog-content\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.850354 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-utilities\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.888359 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zdc7v"] Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.891490 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.902032 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdc7v"] Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.952585 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-utilities\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.952726 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtbvm\" (UniqueName: \"kubernetes.io/projected/fdd8c6e2-09fd-4e27-b428-8d80976baefa-kube-api-access-rtbvm\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.952788 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-catalog-content\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.952856 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-utilities\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.953506 4803 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd6gx\" (UniqueName: \"kubernetes.io/projected/46527924-296c-4c95-a60a-42bf5f170ed3-kube-api-access-vd6gx\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.953903 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-catalog-content\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.953725 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-utilities\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.954151 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-catalog-content\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:17 crc kubenswrapper[4803]: I0320 18:23:17.989670 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtbvm\" (UniqueName: \"kubernetes.io/projected/fdd8c6e2-09fd-4e27-b428-8d80976baefa-kube-api-access-rtbvm\") pod \"certified-operators-gtd94\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.031774 4803 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.055621 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-catalog-content\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.055698 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-utilities\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.055740 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd6gx\" (UniqueName: \"kubernetes.io/projected/46527924-296c-4c95-a60a-42bf5f170ed3-kube-api-access-vd6gx\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.056292 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-catalog-content\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.056545 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-utilities\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " 
pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.076447 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd6gx\" (UniqueName: \"kubernetes.io/projected/46527924-296c-4c95-a60a-42bf5f170ed3-kube-api-access-vd6gx\") pod \"redhat-marketplace-zdc7v\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.212505 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.732761 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gtd94"] Mar 20 18:23:18 crc kubenswrapper[4803]: W0320 18:23:18.734567 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd8c6e2_09fd_4e27_b428_8d80976baefa.slice/crio-9b09591e9466a772a52f8351e1d89050d0f6a400ce61a41f92477424f4b832b7 WatchSource:0}: Error finding container 9b09591e9466a772a52f8351e1d89050d0f6a400ce61a41f92477424f4b832b7: Status 404 returned error can't find the container with id 9b09591e9466a772a52f8351e1d89050d0f6a400ce61a41f92477424f4b832b7 Mar 20 18:23:18 crc kubenswrapper[4803]: W0320 18:23:18.836447 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46527924_296c_4c95_a60a_42bf5f170ed3.slice/crio-81ef9c5df9b653828fa2282f0a9ea5ba490bfe13424c44556f238c01a6c7cf51 WatchSource:0}: Error finding container 81ef9c5df9b653828fa2282f0a9ea5ba490bfe13424c44556f238c01a6c7cf51: Status 404 returned error can't find the container with id 81ef9c5df9b653828fa2282f0a9ea5ba490bfe13424c44556f238c01a6c7cf51 Mar 20 18:23:18 crc kubenswrapper[4803]: I0320 18:23:18.837786 4803 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-zdc7v"] Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.047916 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/util/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.147911 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/util/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.202598 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/pull/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.238342 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/pull/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.449230 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/pull/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.458664 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/extract/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.504691 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9173da642e0709757842966e15477e8d2bc608ce6e8ba054051f5f18f25gqdx_52faafec-fb6f-44ed-8ac3-22022b6fb95e/util/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.612683 4803 generic.go:334] "Generic (PLEG): 
container finished" podID="46527924-296c-4c95-a60a-42bf5f170ed3" containerID="aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713" exitCode=0 Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.612775 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdc7v" event={"ID":"46527924-296c-4c95-a60a-42bf5f170ed3","Type":"ContainerDied","Data":"aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713"} Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.612809 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdc7v" event={"ID":"46527924-296c-4c95-a60a-42bf5f170ed3","Type":"ContainerStarted","Data":"81ef9c5df9b653828fa2282f0a9ea5ba490bfe13424c44556f238c01a6c7cf51"} Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.614902 4803 generic.go:334] "Generic (PLEG): container finished" podID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerID="444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de" exitCode=0 Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.614926 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtd94" event={"ID":"fdd8c6e2-09fd-4e27-b428-8d80976baefa","Type":"ContainerDied","Data":"444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de"} Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.614943 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtd94" event={"ID":"fdd8c6e2-09fd-4e27-b428-8d80976baefa","Type":"ContainerStarted","Data":"9b09591e9466a772a52f8351e1d89050d0f6a400ce61a41f92477424f4b832b7"} Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 18:23:19.787844 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-8gf25_fefd7fa6-fd31-4b28-a7e9-1a4e630070fe/manager/0.log" Mar 20 18:23:19 crc kubenswrapper[4803]: I0320 
18:23:19.899810 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-b7sz6_cfe324bc-b8c8-4971-b1d5-ed9df499771f/manager/0.log" Mar 20 18:23:20 crc kubenswrapper[4803]: I0320 18:23:20.086773 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-lxz9k_b3eec30a-a8b9-40b8-a786-16a339efe990/manager/0.log" Mar 20 18:23:20 crc kubenswrapper[4803]: I0320 18:23:20.165841 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-b5q6h_10bf5d23-ea4e-4c19-bad8-f2cb15d93cc5/manager/0.log" Mar 20 18:23:20 crc kubenswrapper[4803]: I0320 18:23:20.353154 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-smx6n_25fa5c58-b9f7-4cd8-b1c4-41c190df40f1/manager/0.log" Mar 20 18:23:20 crc kubenswrapper[4803]: I0320 18:23:20.636034 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdc7v" event={"ID":"46527924-296c-4c95-a60a-42bf5f170ed3","Type":"ContainerStarted","Data":"864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54"} Mar 20 18:23:20 crc kubenswrapper[4803]: I0320 18:23:20.646011 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtd94" event={"ID":"fdd8c6e2-09fd-4e27-b428-8d80976baefa","Type":"ContainerStarted","Data":"ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4"} Mar 20 18:23:20 crc kubenswrapper[4803]: I0320 18:23:20.739628 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-cqwk9_7a5c980d-5ccf-4e9d-9687-3119240ecc15/manager/0.log" Mar 20 18:23:20 crc kubenswrapper[4803]: I0320 18:23:20.998761 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6f8b7f6fdf-qxc8h_0dbcced3-01dd-45bb-8f6f-1733abb4f7db/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.032470 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-k8594_5ea17b5b-3363-4c08-a7e3-52cbb4cb5616/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.188294 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-wgxn7_23dabc24-9851-4232-8f79-56c8615246c7/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.340925 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-7w5w2_524f49fa-3d73-4089-aa72-cbcfdfbed979/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.377473 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-6pfw8_b01812c6-e35c-4699-8b7b-192a425bf0ce/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.481318 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-xwbd2_75cb9425-bd1a-4311-a85f-76eee943e0e9/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.617226 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-9dwg8_ff62aaff-bfdc-400e-b6ee-217356ba1a23/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.675043 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-vmmmv_e5270e18-ac4b-4f4d-8a6d-085699034cfe/manager/0.log" Mar 20 18:23:21 crc kubenswrapper[4803]: I0320 18:23:21.889083 4803 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5vccbg_002615ba-1b17-467b-a536-7a631c5b434e/manager/0.log" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.021921 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b67cc5c9-9mrzt_c7141176-2b0e-4fb1-8c92-a424c769e059/operator/0.log" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.231267 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-sbnxs_e3b2e374-c317-4d96-80e5-f3e274b31ea8/registry-server/0.log" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.493468 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-lktjh_b6b15ce6-e69e-42ad-a356-9802f8750db4/manager/0.log" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.502384 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-22skw_f7f518fa-6e0e-431d-88f4-f835400eec2a/manager/0.log" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.618079 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.718191 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.734824 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-sh6jl_6616ca24-82bf-405a-b77b-8617c65ec76b/manager/0.log" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.839793 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-fgckh_76713a0c-ed94-4e45-a947-e05e3ef0d3d6/manager/0.log" Mar 20 18:23:22 crc kubenswrapper[4803]: I0320 18:23:22.911410 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-b7znw_6409d065-33f9-4e12-806a-b805e4b6e0ea/manager/0.log" Mar 20 18:23:23 crc kubenswrapper[4803]: I0320 18:23:23.031803 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-dxp5d_a8a0f2f5-7910-44c7-969d-204a7d1327d9/manager/0.log" Mar 20 18:23:23 crc kubenswrapper[4803]: I0320 18:23:23.183694 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-56f44579c8-sht64_36ca4d28-1feb-4c48-bba8-d078f85fc37f/manager/0.log" Mar 20 18:23:23 crc kubenswrapper[4803]: I0320 18:23:23.676487 4803 generic.go:334] "Generic (PLEG): container finished" podID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerID="ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4" exitCode=0 Mar 20 18:23:23 crc kubenswrapper[4803]: I0320 18:23:23.676577 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtd94" event={"ID":"fdd8c6e2-09fd-4e27-b428-8d80976baefa","Type":"ContainerDied","Data":"ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4"} Mar 20 18:23:23 crc kubenswrapper[4803]: I0320 18:23:23.678085 4803 generic.go:334] "Generic (PLEG): container finished" podID="46527924-296c-4c95-a60a-42bf5f170ed3" containerID="864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54" exitCode=0 Mar 20 18:23:23 crc kubenswrapper[4803]: I0320 18:23:23.678118 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdc7v" 
event={"ID":"46527924-296c-4c95-a60a-42bf5f170ed3","Type":"ContainerDied","Data":"864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54"} Mar 20 18:23:24 crc kubenswrapper[4803]: I0320 18:23:24.688994 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtd94" event={"ID":"fdd8c6e2-09fd-4e27-b428-8d80976baefa","Type":"ContainerStarted","Data":"90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8"} Mar 20 18:23:24 crc kubenswrapper[4803]: I0320 18:23:24.692014 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdc7v" event={"ID":"46527924-296c-4c95-a60a-42bf5f170ed3","Type":"ContainerStarted","Data":"98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296"} Mar 20 18:23:24 crc kubenswrapper[4803]: I0320 18:23:24.711740 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gtd94" podStartSLOduration=3.170172791 podStartE2EDuration="7.7117203s" podCreationTimestamp="2026-03-20 18:23:17 +0000 UTC" firstStartedPulling="2026-03-20 18:23:19.616453539 +0000 UTC m=+4009.528045609" lastFinishedPulling="2026-03-20 18:23:24.158001048 +0000 UTC m=+4014.069593118" observedRunningTime="2026-03-20 18:23:24.705539026 +0000 UTC m=+4014.617131106" watchObservedRunningTime="2026-03-20 18:23:24.7117203 +0000 UTC m=+4014.623312370" Mar 20 18:23:24 crc kubenswrapper[4803]: I0320 18:23:24.740689 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zdc7v" podStartSLOduration=3.220932209 podStartE2EDuration="7.740668324s" podCreationTimestamp="2026-03-20 18:23:17 +0000 UTC" firstStartedPulling="2026-03-20 18:23:19.615098161 +0000 UTC m=+4009.526690231" lastFinishedPulling="2026-03-20 18:23:24.134834276 +0000 UTC m=+4014.046426346" observedRunningTime="2026-03-20 18:23:24.731730012 +0000 UTC m=+4014.643322082" 
watchObservedRunningTime="2026-03-20 18:23:24.740668324 +0000 UTC m=+4014.652260404" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.092279 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8t6bf"] Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.092586 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8t6bf" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="registry-server" containerID="cri-o://069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b" gracePeriod=2 Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.651676 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.702872 4803 generic.go:334] "Generic (PLEG): container finished" podID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerID="069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b" exitCode=0 Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.702937 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t6bf" event={"ID":"4c0383c9-c487-4fec-80b1-a10258f91a9c","Type":"ContainerDied","Data":"069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b"} Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.702978 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8t6bf" event={"ID":"4c0383c9-c487-4fec-80b1-a10258f91a9c","Type":"ContainerDied","Data":"8fef4ef90ec6137c7acee85b78ff1609c6077ebac158c398c521742e888f830c"} Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.703000 4803 scope.go:117] "RemoveContainer" containerID="069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.703169 4803 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8t6bf" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.729657 4803 scope.go:117] "RemoveContainer" containerID="6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.754850 4803 scope.go:117] "RemoveContainer" containerID="6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.807239 4803 scope.go:117] "RemoveContainer" containerID="069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b" Mar 20 18:23:25 crc kubenswrapper[4803]: E0320 18:23:25.808858 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b\": container with ID starting with 069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b not found: ID does not exist" containerID="069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.808899 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b"} err="failed to get container status \"069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b\": rpc error: code = NotFound desc = could not find container \"069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b\": container with ID starting with 069d0f33ae752c8801c9ccc58d2a291cb65d5bd14ea12bed3736f002b4a0660b not found: ID does not exist" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.808928 4803 scope.go:117] "RemoveContainer" containerID="6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542" Mar 20 18:23:25 crc kubenswrapper[4803]: E0320 18:23:25.809966 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542\": container with ID starting with 6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542 not found: ID does not exist" containerID="6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.809995 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542"} err="failed to get container status \"6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542\": rpc error: code = NotFound desc = could not find container \"6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542\": container with ID starting with 6178d89a0c22ad7e2abebbf590ecb214819d5c7dbb2c025c7b1683c6e78e1542 not found: ID does not exist" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.810015 4803 scope.go:117] "RemoveContainer" containerID="6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7" Mar 20 18:23:25 crc kubenswrapper[4803]: E0320 18:23:25.810422 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7\": container with ID starting with 6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7 not found: ID does not exist" containerID="6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.810449 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7"} err="failed to get container status \"6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7\": rpc error: code = NotFound desc = could not find container 
\"6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7\": container with ID starting with 6a8472d331f7b4b994dbd66de472c85a2d1cbc1c3e170fa6f41f38a5ad94c9f7 not found: ID does not exist" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.827479 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-catalog-content\") pod \"4c0383c9-c487-4fec-80b1-a10258f91a9c\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.827699 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-utilities\") pod \"4c0383c9-c487-4fec-80b1-a10258f91a9c\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.827878 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6hz4\" (UniqueName: \"kubernetes.io/projected/4c0383c9-c487-4fec-80b1-a10258f91a9c-kube-api-access-v6hz4\") pod \"4c0383c9-c487-4fec-80b1-a10258f91a9c\" (UID: \"4c0383c9-c487-4fec-80b1-a10258f91a9c\") " Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.829862 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-utilities" (OuterVolumeSpecName: "utilities") pod "4c0383c9-c487-4fec-80b1-a10258f91a9c" (UID: "4c0383c9-c487-4fec-80b1-a10258f91a9c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.834276 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0383c9-c487-4fec-80b1-a10258f91a9c-kube-api-access-v6hz4" (OuterVolumeSpecName: "kube-api-access-v6hz4") pod "4c0383c9-c487-4fec-80b1-a10258f91a9c" (UID: "4c0383c9-c487-4fec-80b1-a10258f91a9c"). InnerVolumeSpecName "kube-api-access-v6hz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.929979 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.930231 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6hz4\" (UniqueName: \"kubernetes.io/projected/4c0383c9-c487-4fec-80b1-a10258f91a9c-kube-api-access-v6hz4\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:25 crc kubenswrapper[4803]: I0320 18:23:25.948872 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c0383c9-c487-4fec-80b1-a10258f91a9c" (UID: "4c0383c9-c487-4fec-80b1-a10258f91a9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:23:26 crc kubenswrapper[4803]: I0320 18:23:26.032019 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c0383c9-c487-4fec-80b1-a10258f91a9c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:26 crc kubenswrapper[4803]: I0320 18:23:26.035741 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8t6bf"] Mar 20 18:23:26 crc kubenswrapper[4803]: I0320 18:23:26.044719 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8t6bf"] Mar 20 18:23:26 crc kubenswrapper[4803]: I0320 18:23:26.857031 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" path="/var/lib/kubelet/pods/4c0383c9-c487-4fec-80b1-a10258f91a9c/volumes" Mar 20 18:23:28 crc kubenswrapper[4803]: I0320 18:23:28.031911 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:28 crc kubenswrapper[4803]: I0320 18:23:28.033138 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:28 crc kubenswrapper[4803]: I0320 18:23:28.083579 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:28 crc kubenswrapper[4803]: I0320 18:23:28.213676 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:28 crc kubenswrapper[4803]: I0320 18:23:28.213722 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:28 crc kubenswrapper[4803]: I0320 18:23:28.258992 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:29 crc kubenswrapper[4803]: I0320 18:23:29.781907 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:31 crc kubenswrapper[4803]: I0320 18:23:31.683032 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtd94"] Mar 20 18:23:31 crc kubenswrapper[4803]: I0320 18:23:31.749215 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gtd94" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="registry-server" containerID="cri-o://90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8" gracePeriod=2 Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.229227 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.348819 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-catalog-content\") pod \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.348888 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-utilities\") pod \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.349005 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtbvm\" (UniqueName: \"kubernetes.io/projected/fdd8c6e2-09fd-4e27-b428-8d80976baefa-kube-api-access-rtbvm\") pod 
\"fdd8c6e2-09fd-4e27-b428-8d80976baefa\" (UID: \"fdd8c6e2-09fd-4e27-b428-8d80976baefa\") " Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.350190 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-utilities" (OuterVolumeSpecName: "utilities") pod "fdd8c6e2-09fd-4e27-b428-8d80976baefa" (UID: "fdd8c6e2-09fd-4e27-b428-8d80976baefa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.360794 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdd8c6e2-09fd-4e27-b428-8d80976baefa-kube-api-access-rtbvm" (OuterVolumeSpecName: "kube-api-access-rtbvm") pod "fdd8c6e2-09fd-4e27-b428-8d80976baefa" (UID: "fdd8c6e2-09fd-4e27-b428-8d80976baefa"). InnerVolumeSpecName "kube-api-access-rtbvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.417980 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fdd8c6e2-09fd-4e27-b428-8d80976baefa" (UID: "fdd8c6e2-09fd-4e27-b428-8d80976baefa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.452210 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.452258 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fdd8c6e2-09fd-4e27-b428-8d80976baefa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.452278 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtbvm\" (UniqueName: \"kubernetes.io/projected/fdd8c6e2-09fd-4e27-b428-8d80976baefa-kube-api-access-rtbvm\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.760174 4803 generic.go:334] "Generic (PLEG): container finished" podID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerID="90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8" exitCode=0 Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.760204 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtd94" event={"ID":"fdd8c6e2-09fd-4e27-b428-8d80976baefa","Type":"ContainerDied","Data":"90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8"} Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.760241 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gtd94" event={"ID":"fdd8c6e2-09fd-4e27-b428-8d80976baefa","Type":"ContainerDied","Data":"9b09591e9466a772a52f8351e1d89050d0f6a400ce61a41f92477424f4b832b7"} Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.760258 4803 scope.go:117] "RemoveContainer" containerID="90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 
18:23:32.760279 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gtd94" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.782034 4803 scope.go:117] "RemoveContainer" containerID="ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.816608 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gtd94"] Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.828493 4803 scope.go:117] "RemoveContainer" containerID="444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.836608 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gtd94"] Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.863688 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" path="/var/lib/kubelet/pods/fdd8c6e2-09fd-4e27-b428-8d80976baefa/volumes" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.886986 4803 scope.go:117] "RemoveContainer" containerID="90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8" Mar 20 18:23:32 crc kubenswrapper[4803]: E0320 18:23:32.887733 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8\": container with ID starting with 90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8 not found: ID does not exist" containerID="90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.887787 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8"} err="failed to get 
container status \"90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8\": rpc error: code = NotFound desc = could not find container \"90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8\": container with ID starting with 90ca34a3e01f2108c010d977aa508a963d4faede998c121c4e67cdb92cdd4ce8 not found: ID does not exist" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.887820 4803 scope.go:117] "RemoveContainer" containerID="ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4" Mar 20 18:23:32 crc kubenswrapper[4803]: E0320 18:23:32.888152 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4\": container with ID starting with ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4 not found: ID does not exist" containerID="ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.888176 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4"} err="failed to get container status \"ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4\": rpc error: code = NotFound desc = could not find container \"ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4\": container with ID starting with ab0ed5aaad90a5e3e8716695c48be80de25f0bc795d14babe5182361d80ec3d4 not found: ID does not exist" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.888194 4803 scope.go:117] "RemoveContainer" containerID="444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de" Mar 20 18:23:32 crc kubenswrapper[4803]: E0320 18:23:32.888356 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de\": container with ID starting with 444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de not found: ID does not exist" containerID="444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de" Mar 20 18:23:32 crc kubenswrapper[4803]: I0320 18:23:32.888377 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de"} err="failed to get container status \"444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de\": rpc error: code = NotFound desc = could not find container \"444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de\": container with ID starting with 444d57d3ea3fa8ac90ce95fa26ddedd7173277e7b6b462cf98ba53c47685d3de not found: ID does not exist" Mar 20 18:23:38 crc kubenswrapper[4803]: I0320 18:23:38.255001 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:38 crc kubenswrapper[4803]: I0320 18:23:38.333995 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdc7v"] Mar 20 18:23:38 crc kubenswrapper[4803]: I0320 18:23:38.816843 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zdc7v" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="registry-server" containerID="cri-o://98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296" gracePeriod=2 Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.301175 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.483494 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-utilities\") pod \"46527924-296c-4c95-a60a-42bf5f170ed3\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.483910 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd6gx\" (UniqueName: \"kubernetes.io/projected/46527924-296c-4c95-a60a-42bf5f170ed3-kube-api-access-vd6gx\") pod \"46527924-296c-4c95-a60a-42bf5f170ed3\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.484007 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-catalog-content\") pod \"46527924-296c-4c95-a60a-42bf5f170ed3\" (UID: \"46527924-296c-4c95-a60a-42bf5f170ed3\") " Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.484398 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-utilities" (OuterVolumeSpecName: "utilities") pod "46527924-296c-4c95-a60a-42bf5f170ed3" (UID: "46527924-296c-4c95-a60a-42bf5f170ed3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.484748 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.494689 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46527924-296c-4c95-a60a-42bf5f170ed3-kube-api-access-vd6gx" (OuterVolumeSpecName: "kube-api-access-vd6gx") pod "46527924-296c-4c95-a60a-42bf5f170ed3" (UID: "46527924-296c-4c95-a60a-42bf5f170ed3"). InnerVolumeSpecName "kube-api-access-vd6gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.513434 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46527924-296c-4c95-a60a-42bf5f170ed3" (UID: "46527924-296c-4c95-a60a-42bf5f170ed3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.586128 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd6gx\" (UniqueName: \"kubernetes.io/projected/46527924-296c-4c95-a60a-42bf5f170ed3-kube-api-access-vd6gx\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.586159 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46527924-296c-4c95-a60a-42bf5f170ed3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.827850 4803 generic.go:334] "Generic (PLEG): container finished" podID="46527924-296c-4c95-a60a-42bf5f170ed3" containerID="98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296" exitCode=0 Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.827899 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdc7v" event={"ID":"46527924-296c-4c95-a60a-42bf5f170ed3","Type":"ContainerDied","Data":"98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296"} Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.827908 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zdc7v" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.827929 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zdc7v" event={"ID":"46527924-296c-4c95-a60a-42bf5f170ed3","Type":"ContainerDied","Data":"81ef9c5df9b653828fa2282f0a9ea5ba490bfe13424c44556f238c01a6c7cf51"} Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.827949 4803 scope.go:117] "RemoveContainer" containerID="98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.857410 4803 scope.go:117] "RemoveContainer" containerID="864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.865835 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdc7v"] Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.876564 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zdc7v"] Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.882102 4803 scope.go:117] "RemoveContainer" containerID="aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.931841 4803 scope.go:117] "RemoveContainer" containerID="98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296" Mar 20 18:23:39 crc kubenswrapper[4803]: E0320 18:23:39.932305 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296\": container with ID starting with 98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296 not found: ID does not exist" containerID="98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.932356 4803 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296"} err="failed to get container status \"98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296\": rpc error: code = NotFound desc = could not find container \"98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296\": container with ID starting with 98cfaf5a65506fe3936d4e2061b182da47d3ce805e7bbba7cd91cd0cb1d85296 not found: ID does not exist" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.932389 4803 scope.go:117] "RemoveContainer" containerID="864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54" Mar 20 18:23:39 crc kubenswrapper[4803]: E0320 18:23:39.932810 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54\": container with ID starting with 864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54 not found: ID does not exist" containerID="864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.932849 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54"} err="failed to get container status \"864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54\": rpc error: code = NotFound desc = could not find container \"864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54\": container with ID starting with 864f9edcc6028569f1e43207e123384d139f06ce34ce55b405cbf90ef4bebe54 not found: ID does not exist" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.932876 4803 scope.go:117] "RemoveContainer" containerID="aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713" Mar 20 18:23:39 crc kubenswrapper[4803]: E0320 
18:23:39.933139 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713\": container with ID starting with aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713 not found: ID does not exist" containerID="aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713" Mar 20 18:23:39 crc kubenswrapper[4803]: I0320 18:23:39.933165 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713"} err="failed to get container status \"aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713\": rpc error: code = NotFound desc = could not find container \"aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713\": container with ID starting with aac7c38523950a714c62983dce9212181f4e0887c0a70dfebd794b896fc1d713 not found: ID does not exist" Mar 20 18:23:40 crc kubenswrapper[4803]: I0320 18:23:40.859964 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" path="/var/lib/kubelet/pods/46527924-296c-4c95-a60a-42bf5f170ed3/volumes" Mar 20 18:23:41 crc kubenswrapper[4803]: I0320 18:23:41.564867 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-k5nvk_f13296fc-7b19-43e5-9f80-08502dee6f1b/control-plane-machine-set-operator/0.log" Mar 20 18:23:41 crc kubenswrapper[4803]: I0320 18:23:41.748796 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ph6zk_f3f47d35-b096-47cb-879d-05004b9cbcf4/kube-rbac-proxy/0.log" Mar 20 18:23:41 crc kubenswrapper[4803]: I0320 18:23:41.841992 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ph6zk_f3f47d35-b096-47cb-879d-05004b9cbcf4/machine-api-operator/0.log" Mar 20 18:23:53 crc kubenswrapper[4803]: I0320 18:23:53.975850 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-dt62l_06da84ce-5bd4-4e75-ac2d-eda831724e58/cert-manager-controller/0.log" Mar 20 18:23:54 crc kubenswrapper[4803]: I0320 18:23:54.167629 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-wfvp8_4d10f699-d096-4cc0-bd7d-1afc806ede10/cert-manager-cainjector/0.log" Mar 20 18:23:54 crc kubenswrapper[4803]: I0320 18:23:54.196834 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-gq2dl_ea5d9398-4412-4a8c-a015-42ec0733de0c/cert-manager-webhook/0.log" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.606047 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-74k2l"] Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.607727 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="extract-content" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.607805 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="extract-content" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.607869 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="extract-content" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.607930 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="extract-content" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.608000 4803 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="extract-utilities" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.608055 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="extract-utilities" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.608117 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="extract-content" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.608171 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="extract-content" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.608228 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.608281 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.608356 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="extract-utilities" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.608415 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="extract-utilities" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.608477 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.608931 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.609032 4803 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="extract-utilities" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.609091 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="extract-utilities" Mar 20 18:23:58 crc kubenswrapper[4803]: E0320 18:23:58.609153 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.609212 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.609442 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdd8c6e2-09fd-4e27-b428-8d80976baefa" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.609516 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="46527924-296c-4c95-a60a-42bf5f170ed3" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.609627 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0383c9-c487-4fec-80b1-a10258f91a9c" containerName="registry-server" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.610939 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.628455 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-74k2l"] Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.673692 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjwd5\" (UniqueName: \"kubernetes.io/projected/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-kube-api-access-zjwd5\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.673809 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-catalog-content\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.673851 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-utilities\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.775944 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjwd5\" (UniqueName: \"kubernetes.io/projected/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-kube-api-access-zjwd5\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.776021 4803 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-catalog-content\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.776049 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-utilities\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.777000 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-utilities\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.777003 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-catalog-content\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.795158 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjwd5\" (UniqueName: \"kubernetes.io/projected/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-kube-api-access-zjwd5\") pod \"community-operators-74k2l\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:58 crc kubenswrapper[4803]: I0320 18:23:58.931847 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:23:59 crc kubenswrapper[4803]: I0320 18:23:59.465015 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-74k2l"] Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.029295 4803 generic.go:334] "Generic (PLEG): container finished" podID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerID="8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d" exitCode=0 Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.029410 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74k2l" event={"ID":"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d","Type":"ContainerDied","Data":"8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d"} Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.029753 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74k2l" event={"ID":"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d","Type":"ContainerStarted","Data":"5c93107d889629a475b609b85e8670c26977bc4303d2bed5d22041dcd9f0a259"} Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.141913 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567184-5jg5z"] Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.143732 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-5jg5z" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.145927 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.147245 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.147420 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.155841 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-5jg5z"] Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.202033 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9ljt\" (UniqueName: \"kubernetes.io/projected/637b5048-6215-47ef-ac6a-454784740aab-kube-api-access-m9ljt\") pod \"auto-csr-approver-29567184-5jg5z\" (UID: \"637b5048-6215-47ef-ac6a-454784740aab\") " pod="openshift-infra/auto-csr-approver-29567184-5jg5z" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.303904 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9ljt\" (UniqueName: \"kubernetes.io/projected/637b5048-6215-47ef-ac6a-454784740aab-kube-api-access-m9ljt\") pod \"auto-csr-approver-29567184-5jg5z\" (UID: \"637b5048-6215-47ef-ac6a-454784740aab\") " pod="openshift-infra/auto-csr-approver-29567184-5jg5z" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.323344 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9ljt\" (UniqueName: \"kubernetes.io/projected/637b5048-6215-47ef-ac6a-454784740aab-kube-api-access-m9ljt\") pod \"auto-csr-approver-29567184-5jg5z\" (UID: \"637b5048-6215-47ef-ac6a-454784740aab\") " 
pod="openshift-infra/auto-csr-approver-29567184-5jg5z" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.463437 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-5jg5z" Mar 20 18:24:00 crc kubenswrapper[4803]: I0320 18:24:00.914176 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-5jg5z"] Mar 20 18:24:01 crc kubenswrapper[4803]: I0320 18:24:01.041070 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-5jg5z" event={"ID":"637b5048-6215-47ef-ac6a-454784740aab","Type":"ContainerStarted","Data":"a0217465a8a2b7d7043105de96c1c9062db09881d631147a883bef658d192370"} Mar 20 18:24:02 crc kubenswrapper[4803]: I0320 18:24:02.055839 4803 generic.go:334] "Generic (PLEG): container finished" podID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerID="63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5" exitCode=0 Mar 20 18:24:02 crc kubenswrapper[4803]: I0320 18:24:02.055947 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74k2l" event={"ID":"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d","Type":"ContainerDied","Data":"63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5"} Mar 20 18:24:03 crc kubenswrapper[4803]: I0320 18:24:03.070897 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74k2l" event={"ID":"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d","Type":"ContainerStarted","Data":"8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16"} Mar 20 18:24:03 crc kubenswrapper[4803]: I0320 18:24:03.075892 4803 generic.go:334] "Generic (PLEG): container finished" podID="637b5048-6215-47ef-ac6a-454784740aab" containerID="51a4eb56d38cc431da6059efc9ef0f6b2bbd9cd3ace53dd15ccfefa6630ffe23" exitCode=0 Mar 20 18:24:03 crc kubenswrapper[4803]: I0320 18:24:03.075948 4803 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-5jg5z" event={"ID":"637b5048-6215-47ef-ac6a-454784740aab","Type":"ContainerDied","Data":"51a4eb56d38cc431da6059efc9ef0f6b2bbd9cd3ace53dd15ccfefa6630ffe23"} Mar 20 18:24:03 crc kubenswrapper[4803]: I0320 18:24:03.101355 4803 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-74k2l" podStartSLOduration=2.659475681 podStartE2EDuration="5.101339911s" podCreationTimestamp="2026-03-20 18:23:58 +0000 UTC" firstStartedPulling="2026-03-20 18:24:00.030814931 +0000 UTC m=+4049.942407011" lastFinishedPulling="2026-03-20 18:24:02.472679171 +0000 UTC m=+4052.384271241" observedRunningTime="2026-03-20 18:24:03.091912345 +0000 UTC m=+4053.003504435" watchObservedRunningTime="2026-03-20 18:24:03.101339911 +0000 UTC m=+4053.012931981" Mar 20 18:24:04 crc kubenswrapper[4803]: I0320 18:24:04.420877 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-5jg5z" Mar 20 18:24:04 crc kubenswrapper[4803]: I0320 18:24:04.475991 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9ljt\" (UniqueName: \"kubernetes.io/projected/637b5048-6215-47ef-ac6a-454784740aab-kube-api-access-m9ljt\") pod \"637b5048-6215-47ef-ac6a-454784740aab\" (UID: \"637b5048-6215-47ef-ac6a-454784740aab\") " Mar 20 18:24:04 crc kubenswrapper[4803]: I0320 18:24:04.486798 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637b5048-6215-47ef-ac6a-454784740aab-kube-api-access-m9ljt" (OuterVolumeSpecName: "kube-api-access-m9ljt") pod "637b5048-6215-47ef-ac6a-454784740aab" (UID: "637b5048-6215-47ef-ac6a-454784740aab"). InnerVolumeSpecName "kube-api-access-m9ljt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:24:04 crc kubenswrapper[4803]: I0320 18:24:04.578721 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9ljt\" (UniqueName: \"kubernetes.io/projected/637b5048-6215-47ef-ac6a-454784740aab-kube-api-access-m9ljt\") on node \"crc\" DevicePath \"\"" Mar 20 18:24:05 crc kubenswrapper[4803]: I0320 18:24:05.094770 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567184-5jg5z" event={"ID":"637b5048-6215-47ef-ac6a-454784740aab","Type":"ContainerDied","Data":"a0217465a8a2b7d7043105de96c1c9062db09881d631147a883bef658d192370"} Mar 20 18:24:05 crc kubenswrapper[4803]: I0320 18:24:05.094814 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0217465a8a2b7d7043105de96c1c9062db09881d631147a883bef658d192370" Mar 20 18:24:05 crc kubenswrapper[4803]: I0320 18:24:05.094836 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567184-5jg5z" Mar 20 18:24:05 crc kubenswrapper[4803]: I0320 18:24:05.497045 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-8pkkx"] Mar 20 18:24:05 crc kubenswrapper[4803]: I0320 18:24:05.506450 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567178-8pkkx"] Mar 20 18:24:06 crc kubenswrapper[4803]: I0320 18:24:06.859516 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd207e9b-7b34-48bd-9f9d-5201793fa6f3" path="/var/lib/kubelet/pods/dd207e9b-7b34-48bd-9f9d-5201793fa6f3/volumes" Mar 20 18:24:06 crc kubenswrapper[4803]: I0320 18:24:06.968177 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-2nnmb_38d598c3-b9e5-4404-abb4-da1e9354e157/nmstate-console-plugin/0.log" Mar 20 18:24:07 crc kubenswrapper[4803]: I0320 18:24:07.130542 4803 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-stkpl_7130990e-c3d3-48fc-99a3-31f225ec19ee/nmstate-handler/0.log" Mar 20 18:24:07 crc kubenswrapper[4803]: I0320 18:24:07.216410 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-fjk5k_8b0bc609-411b-43cc-b7cf-a88f669b2d44/kube-rbac-proxy/0.log" Mar 20 18:24:07 crc kubenswrapper[4803]: I0320 18:24:07.258514 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-fjk5k_8b0bc609-411b-43cc-b7cf-a88f669b2d44/nmstate-metrics/0.log" Mar 20 18:24:07 crc kubenswrapper[4803]: I0320 18:24:07.355566 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-xgfwl_f2fb97b5-40f7-443d-8680-95e112804031/nmstate-operator/0.log" Mar 20 18:24:07 crc kubenswrapper[4803]: I0320 18:24:07.430903 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-krj4f_0bf484cc-0fe2-4cb0-99c2-0714910012ca/nmstate-webhook/0.log" Mar 20 18:24:08 crc kubenswrapper[4803]: I0320 18:24:08.932811 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:24:08 crc kubenswrapper[4803]: I0320 18:24:08.933098 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:24:08 crc kubenswrapper[4803]: I0320 18:24:08.989112 4803 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:24:09 crc kubenswrapper[4803]: I0320 18:24:09.170331 4803 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:24:09 crc kubenswrapper[4803]: I0320 18:24:09.224365 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-74k2l"] Mar 20 18:24:11 crc kubenswrapper[4803]: I0320 18:24:11.146569 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-74k2l" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="registry-server" containerID="cri-o://8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16" gracePeriod=2 Mar 20 18:24:11 crc kubenswrapper[4803]: I0320 18:24:11.644903 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:24:11 crc kubenswrapper[4803]: I0320 18:24:11.800947 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjwd5\" (UniqueName: \"kubernetes.io/projected/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-kube-api-access-zjwd5\") pod \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " Mar 20 18:24:11 crc kubenswrapper[4803]: I0320 18:24:11.801056 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-catalog-content\") pod \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " Mar 20 18:24:11 crc kubenswrapper[4803]: I0320 18:24:11.801168 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-utilities\") pod \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\" (UID: \"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d\") " Mar 20 18:24:11 crc kubenswrapper[4803]: I0320 18:24:11.808364 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-utilities" (OuterVolumeSpecName: "utilities") pod "fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" (UID: 
"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:24:11 crc kubenswrapper[4803]: I0320 18:24:11.903841 4803 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.083756 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" (UID: "fbdbf7af-b7be-4aba-a48c-3af9a44aed8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.107739 4803 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.159055 4803 generic.go:334] "Generic (PLEG): container finished" podID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerID="8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16" exitCode=0 Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.159110 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74k2l" event={"ID":"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d","Type":"ContainerDied","Data":"8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16"} Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.159145 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74k2l" 
event={"ID":"fbdbf7af-b7be-4aba-a48c-3af9a44aed8d","Type":"ContainerDied","Data":"5c93107d889629a475b609b85e8670c26977bc4303d2bed5d22041dcd9f0a259"} Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.159169 4803 scope.go:117] "RemoveContainer" containerID="8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.159361 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-74k2l" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.193383 4803 scope.go:117] "RemoveContainer" containerID="63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.276099 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-kube-api-access-zjwd5" (OuterVolumeSpecName: "kube-api-access-zjwd5") pod "fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" (UID: "fbdbf7af-b7be-4aba-a48c-3af9a44aed8d"). InnerVolumeSpecName "kube-api-access-zjwd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.293323 4803 scope.go:117] "RemoveContainer" containerID="8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.311977 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjwd5\" (UniqueName: \"kubernetes.io/projected/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d-kube-api-access-zjwd5\") on node \"crc\" DevicePath \"\"" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.397110 4803 scope.go:117] "RemoveContainer" containerID="8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16" Mar 20 18:24:12 crc kubenswrapper[4803]: E0320 18:24:12.397881 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16\": container with ID starting with 8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16 not found: ID does not exist" containerID="8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.397962 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16"} err="failed to get container status \"8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16\": rpc error: code = NotFound desc = could not find container \"8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16\": container with ID starting with 8b9719798b5f1dc5957ba24ca48f508a6d1938b69b280daadc6756982323bf16 not found: ID does not exist" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.398022 4803 scope.go:117] "RemoveContainer" containerID="63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5" Mar 20 18:24:12 crc kubenswrapper[4803]: E0320 18:24:12.398575 
4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5\": container with ID starting with 63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5 not found: ID does not exist" containerID="63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.398607 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5"} err="failed to get container status \"63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5\": rpc error: code = NotFound desc = could not find container \"63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5\": container with ID starting with 63fdb969b94823c07782c28684ada620787d932857e60378ca1c73bddd428fc5 not found: ID does not exist" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.398629 4803 scope.go:117] "RemoveContainer" containerID="8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d" Mar 20 18:24:12 crc kubenswrapper[4803]: E0320 18:24:12.399095 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d\": container with ID starting with 8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d not found: ID does not exist" containerID="8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.399175 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d"} err="failed to get container status \"8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d\": rpc error: code = 
NotFound desc = could not find container \"8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d\": container with ID starting with 8fe0325bc4ff0baed078767c1632ee18a34bf115dbee077c9c78721ce23ec76d not found: ID does not exist" Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.506362 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-74k2l"] Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.518916 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-74k2l"] Mar 20 18:24:12 crc kubenswrapper[4803]: I0320 18:24:12.857791 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" path="/var/lib/kubelet/pods/fbdbf7af-b7be-4aba-a48c-3af9a44aed8d/volumes" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.121689 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-24x6b_06c64fc0-e716-455b-bff4-0aac055505a9/kube-rbac-proxy/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.303510 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-24x6b_06c64fc0-e716-455b-bff4-0aac055505a9/controller/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.385436 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-6nwbj_3181265c-ba7e-4f29-9950-bfefd81e98e5/frr-k8s-webhook-server/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.485654 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.643413 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:24:36 crc 
kubenswrapper[4803]: I0320 18:24:36.654786 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.678989 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.704411 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.902590 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.909908 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.915252 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:24:36 crc kubenswrapper[4803]: I0320 18:24:36.947230 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.044355 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-frr-files/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.073286 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-metrics/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.073759 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/cp-reloader/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.118836 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/controller/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.217743 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/kube-rbac-proxy/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.247146 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/frr-metrics/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.322034 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/kube-rbac-proxy-frr/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.465789 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/reloader/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.524308 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-599d9f9c9-jbh6h_656f1985-be0a-4447-a03f-2ec4d11727c2/manager/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.710119 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66cfbc7d76-5tcqn_f939090a-abec-48ac-9e06-10175ff02c71/webhook-server/0.log" Mar 20 18:24:37 crc kubenswrapper[4803]: I0320 18:24:37.876002 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qjwcn_6b632859-081b-4be0-a3f6-9b91b4687ecf/kube-rbac-proxy/0.log" Mar 20 18:24:38 crc kubenswrapper[4803]: I0320 18:24:38.245283 4803 patch_prober.go:28] interesting 
pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:24:38 crc kubenswrapper[4803]: I0320 18:24:38.245557 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:24:38 crc kubenswrapper[4803]: I0320 18:24:38.271080 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qjwcn_6b632859-081b-4be0-a3f6-9b91b4687ecf/speaker/0.log" Mar 20 18:24:38 crc kubenswrapper[4803]: I0320 18:24:38.991747 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wkwlf_4367c661-d351-46c9-9a1f-87f039fe6458/frr/0.log" Mar 20 18:24:41 crc kubenswrapper[4803]: I0320 18:24:41.488570 4803 scope.go:117] "RemoveContainer" containerID="02d1d84476808c7878973b7c431017cb412506e64972747ba7fc5e325378c7f0" Mar 20 18:24:51 crc kubenswrapper[4803]: I0320 18:24:51.839929 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/util/0.log" Mar 20 18:24:52 crc kubenswrapper[4803]: I0320 18:24:52.564537 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/pull/0.log" Mar 20 18:24:52 crc kubenswrapper[4803]: I0320 18:24:52.583813 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/pull/0.log" Mar 20 18:24:52 crc kubenswrapper[4803]: I0320 18:24:52.606657 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/util/0.log" Mar 20 18:24:52 crc kubenswrapper[4803]: I0320 18:24:52.796820 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/util/0.log" Mar 20 18:24:52 crc kubenswrapper[4803]: I0320 18:24:52.804832 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/pull/0.log" Mar 20 18:24:52 crc kubenswrapper[4803]: I0320 18:24:52.850570 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874l9j9x_717f35f6-140a-4071-8f5a-ce0e166c79bb/extract/0.log" Mar 20 18:24:52 crc kubenswrapper[4803]: I0320 18:24:52.967960 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/util/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.184137 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/pull/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.189999 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/util/0.log" Mar 20 
18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.228728 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/pull/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.555075 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/pull/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.599297 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/extract/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.604637 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c15ppjc_48f49a52-830d-4717-820b-9c214238244d/util/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.766849 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-utilities/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.943262 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-utilities/0.log" Mar 20 18:24:53 crc kubenswrapper[4803]: I0320 18:24:53.988899 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-content/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.004439 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-content/0.log" Mar 20 
18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.190001 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-content/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.202481 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/extract-utilities/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.405559 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-utilities/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.478995 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q87mb_67f8c88e-3bdc-403b-8ed0-8c1c2cab9d6c/registry-server/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.580146 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-content/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.633139 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-utilities/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.667262 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-content/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.837721 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-utilities/0.log" Mar 20 18:24:54 crc kubenswrapper[4803]: I0320 18:24:54.851114 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/extract-content/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.034026 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-s92b2_617ba1c8-b42d-4e9c-8d5b-6a903f267358/marketplace-operator/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.174501 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-utilities/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.324904 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2hjcd_ff59d6d7-cbbe-4245-9dc5-6bf6c41c5f98/registry-server/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.400781 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-utilities/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.416731 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-content/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.430377 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-content/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.599300 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-utilities/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.643763 4803 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/extract-content/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.794304 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vsx7r_66687ffe-614d-4427-a236-13b8623bbd4c/registry-server/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.798946 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-utilities/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.958077 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-content/0.log" Mar 20 18:24:55 crc kubenswrapper[4803]: I0320 18:24:55.959769 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-utilities/0.log" Mar 20 18:24:56 crc kubenswrapper[4803]: I0320 18:24:56.004676 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-content/0.log" Mar 20 18:24:56 crc kubenswrapper[4803]: I0320 18:24:56.159158 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-utilities/0.log" Mar 20 18:24:56 crc kubenswrapper[4803]: I0320 18:24:56.249012 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/extract-content/0.log" Mar 20 18:24:56 crc kubenswrapper[4803]: I0320 18:24:56.592331 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xnbtp_3c920780-df66-4654-a41c-b178a85a7885/registry-server/0.log" Mar 20 
18:25:08 crc kubenswrapper[4803]: I0320 18:25:08.245650 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:25:08 crc kubenswrapper[4803]: I0320 18:25:08.246132 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.246107 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.246582 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.246627 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.247097 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2fdd16f5246edd6c5e9048a1a65ae17cc95dfdb32f5019a55bd0a73d20083db4"} 
pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.247148 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://2fdd16f5246edd6c5e9048a1a65ae17cc95dfdb32f5019a55bd0a73d20083db4" gracePeriod=600 Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.889280 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="2fdd16f5246edd6c5e9048a1a65ae17cc95dfdb32f5019a55bd0a73d20083db4" exitCode=0 Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.889629 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"2fdd16f5246edd6c5e9048a1a65ae17cc95dfdb32f5019a55bd0a73d20083db4"} Mar 20 18:25:38 crc kubenswrapper[4803]: I0320 18:25:38.889680 4803 scope.go:117] "RemoveContainer" containerID="b25b9b8ee563e1b8c318b55995d8ae42cca7655b4772a199316c483bd2c9497f" Mar 20 18:25:39 crc kubenswrapper[4803]: I0320 18:25:39.900358 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerStarted","Data":"08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789"} Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.156962 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567186-prv84"] Mar 20 18:26:00 crc kubenswrapper[4803]: E0320 18:26:00.157869 4803 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.157884 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="extract-content" Mar 20 18:26:00 crc kubenswrapper[4803]: E0320 18:26:00.157916 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.157924 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4803]: E0320 18:26:00.157940 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637b5048-6215-47ef-ac6a-454784740aab" containerName="oc" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.157948 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="637b5048-6215-47ef-ac6a-454784740aab" containerName="oc" Mar 20 18:26:00 crc kubenswrapper[4803]: E0320 18:26:00.157971 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.157978 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="extract-utilities" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.158172 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="637b5048-6215-47ef-ac6a-454784740aab" containerName="oc" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.158190 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbdbf7af-b7be-4aba-a48c-3af9a44aed8d" containerName="registry-server" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.158957 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-prv84" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.162537 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.162774 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.162884 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.183026 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-prv84"] Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.302958 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55457\" (UniqueName: \"kubernetes.io/projected/d701028b-b006-498e-98eb-cb8b73759e6f-kube-api-access-55457\") pod \"auto-csr-approver-29567186-prv84\" (UID: \"d701028b-b006-498e-98eb-cb8b73759e6f\") " pod="openshift-infra/auto-csr-approver-29567186-prv84" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.405005 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55457\" (UniqueName: \"kubernetes.io/projected/d701028b-b006-498e-98eb-cb8b73759e6f-kube-api-access-55457\") pod \"auto-csr-approver-29567186-prv84\" (UID: \"d701028b-b006-498e-98eb-cb8b73759e6f\") " pod="openshift-infra/auto-csr-approver-29567186-prv84" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.438999 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55457\" (UniqueName: \"kubernetes.io/projected/d701028b-b006-498e-98eb-cb8b73759e6f-kube-api-access-55457\") pod \"auto-csr-approver-29567186-prv84\" (UID: \"d701028b-b006-498e-98eb-cb8b73759e6f\") " 
pod="openshift-infra/auto-csr-approver-29567186-prv84" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.497578 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-prv84" Mar 20 18:26:00 crc kubenswrapper[4803]: I0320 18:26:00.959593 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567186-prv84"] Mar 20 18:26:01 crc kubenswrapper[4803]: I0320 18:26:01.535669 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-prv84" event={"ID":"d701028b-b006-498e-98eb-cb8b73759e6f","Type":"ContainerStarted","Data":"dd075b5a9013c29733fca6b817b4f788b2018a85fa3e1c111c3bd4c9106c6908"} Mar 20 18:26:02 crc kubenswrapper[4803]: I0320 18:26:02.544082 4803 generic.go:334] "Generic (PLEG): container finished" podID="d701028b-b006-498e-98eb-cb8b73759e6f" containerID="86fba219e43ba41e040d4eb41c412aef6279ac66ead60b344246100c52531bbc" exitCode=0 Mar 20 18:26:02 crc kubenswrapper[4803]: I0320 18:26:02.544121 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-prv84" event={"ID":"d701028b-b006-498e-98eb-cb8b73759e6f","Type":"ContainerDied","Data":"86fba219e43ba41e040d4eb41c412aef6279ac66ead60b344246100c52531bbc"} Mar 20 18:26:03 crc kubenswrapper[4803]: I0320 18:26:03.944265 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-prv84" Mar 20 18:26:04 crc kubenswrapper[4803]: I0320 18:26:04.098813 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55457\" (UniqueName: \"kubernetes.io/projected/d701028b-b006-498e-98eb-cb8b73759e6f-kube-api-access-55457\") pod \"d701028b-b006-498e-98eb-cb8b73759e6f\" (UID: \"d701028b-b006-498e-98eb-cb8b73759e6f\") " Mar 20 18:26:04 crc kubenswrapper[4803]: I0320 18:26:04.110919 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d701028b-b006-498e-98eb-cb8b73759e6f-kube-api-access-55457" (OuterVolumeSpecName: "kube-api-access-55457") pod "d701028b-b006-498e-98eb-cb8b73759e6f" (UID: "d701028b-b006-498e-98eb-cb8b73759e6f"). InnerVolumeSpecName "kube-api-access-55457". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:26:04 crc kubenswrapper[4803]: I0320 18:26:04.201004 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55457\" (UniqueName: \"kubernetes.io/projected/d701028b-b006-498e-98eb-cb8b73759e6f-kube-api-access-55457\") on node \"crc\" DevicePath \"\"" Mar 20 18:26:04 crc kubenswrapper[4803]: I0320 18:26:04.565982 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567186-prv84" event={"ID":"d701028b-b006-498e-98eb-cb8b73759e6f","Type":"ContainerDied","Data":"dd075b5a9013c29733fca6b817b4f788b2018a85fa3e1c111c3bd4c9106c6908"} Mar 20 18:26:04 crc kubenswrapper[4803]: I0320 18:26:04.566650 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd075b5a9013c29733fca6b817b4f788b2018a85fa3e1c111c3bd4c9106c6908" Mar 20 18:26:04 crc kubenswrapper[4803]: I0320 18:26:04.566013 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567186-prv84" Mar 20 18:26:05 crc kubenswrapper[4803]: I0320 18:26:05.024158 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-l5d75"] Mar 20 18:26:05 crc kubenswrapper[4803]: I0320 18:26:05.034620 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567180-l5d75"] Mar 20 18:26:06 crc kubenswrapper[4803]: I0320 18:26:06.882063 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8217899f-3d1a-40be-871d-2315f7b6878b" path="/var/lib/kubelet/pods/8217899f-3d1a-40be-871d-2315f7b6878b/volumes" Mar 20 18:26:41 crc kubenswrapper[4803]: I0320 18:26:41.819490 4803 scope.go:117] "RemoveContainer" containerID="13e0d63cc3019f0660afec642b53963b8ac60a69c3f6a66f8b77f7d421e4aad4" Mar 20 18:26:50 crc kubenswrapper[4803]: I0320 18:26:50.148592 4803 generic.go:334] "Generic (PLEG): container finished" podID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerID="398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d" exitCode=0 Mar 20 18:26:50 crc kubenswrapper[4803]: I0320 18:26:50.148696 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sclvs/must-gather-tpvnk" event={"ID":"11de5097-05f5-47c5-a17f-cb331c42fa58","Type":"ContainerDied","Data":"398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d"} Mar 20 18:26:50 crc kubenswrapper[4803]: I0320 18:26:50.150087 4803 scope.go:117] "RemoveContainer" containerID="398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d" Mar 20 18:26:50 crc kubenswrapper[4803]: I0320 18:26:50.474547 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sclvs_must-gather-tpvnk_11de5097-05f5-47c5-a17f-cb331c42fa58/gather/0.log" Mar 20 18:27:01 crc kubenswrapper[4803]: I0320 18:27:01.596855 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-sclvs/must-gather-tpvnk"] Mar 20 18:27:01 crc kubenswrapper[4803]: I0320 18:27:01.597819 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-sclvs/must-gather-tpvnk" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerName="copy" containerID="cri-o://21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb" gracePeriod=2 Mar 20 18:27:01 crc kubenswrapper[4803]: I0320 18:27:01.607662 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sclvs/must-gather-tpvnk"] Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.034355 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sclvs_must-gather-tpvnk_11de5097-05f5-47c5-a17f-cb331c42fa58/copy/0.log" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.035069 4803 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.107477 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72x5v\" (UniqueName: \"kubernetes.io/projected/11de5097-05f5-47c5-a17f-cb331c42fa58-kube-api-access-72x5v\") pod \"11de5097-05f5-47c5-a17f-cb331c42fa58\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.107595 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11de5097-05f5-47c5-a17f-cb331c42fa58-must-gather-output\") pod \"11de5097-05f5-47c5-a17f-cb331c42fa58\" (UID: \"11de5097-05f5-47c5-a17f-cb331c42fa58\") " Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.113886 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11de5097-05f5-47c5-a17f-cb331c42fa58-kube-api-access-72x5v" (OuterVolumeSpecName: 
"kube-api-access-72x5v") pod "11de5097-05f5-47c5-a17f-cb331c42fa58" (UID: "11de5097-05f5-47c5-a17f-cb331c42fa58"). InnerVolumeSpecName "kube-api-access-72x5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.209355 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72x5v\" (UniqueName: \"kubernetes.io/projected/11de5097-05f5-47c5-a17f-cb331c42fa58-kube-api-access-72x5v\") on node \"crc\" DevicePath \"\"" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.267314 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11de5097-05f5-47c5-a17f-cb331c42fa58-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "11de5097-05f5-47c5-a17f-cb331c42fa58" (UID: "11de5097-05f5-47c5-a17f-cb331c42fa58"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.311271 4803 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/11de5097-05f5-47c5-a17f-cb331c42fa58-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.616900 4803 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sclvs_must-gather-tpvnk_11de5097-05f5-47c5-a17f-cb331c42fa58/copy/0.log" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.618828 4803 generic.go:334] "Generic (PLEG): container finished" podID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerID="21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb" exitCode=143 Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.619081 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sclvs/must-gather-tpvnk" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.623581 4803 scope.go:117] "RemoveContainer" containerID="21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.664624 4803 scope.go:117] "RemoveContainer" containerID="398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.747598 4803 scope.go:117] "RemoveContainer" containerID="21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb" Mar 20 18:27:02 crc kubenswrapper[4803]: E0320 18:27:02.748063 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb\": container with ID starting with 21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb not found: ID does not exist" containerID="21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.748114 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb"} err="failed to get container status \"21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb\": rpc error: code = NotFound desc = could not find container \"21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb\": container with ID starting with 21dfad7ef643370e092c430c5bb8e2cc27cd781aa2584c866dc22c4215a2c6cb not found: ID does not exist" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.748140 4803 scope.go:117] "RemoveContainer" containerID="398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d" Mar 20 18:27:02 crc kubenswrapper[4803]: E0320 18:27:02.748459 4803 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d\": container with ID starting with 398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d not found: ID does not exist" containerID="398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.748507 4803 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d"} err="failed to get container status \"398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d\": rpc error: code = NotFound desc = could not find container \"398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d\": container with ID starting with 398ef7055ad13117d941a1b0fd2d2f5f8a88be6ebff7437863703ff653b5765d not found: ID does not exist" Mar 20 18:27:02 crc kubenswrapper[4803]: I0320 18:27:02.858831 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" path="/var/lib/kubelet/pods/11de5097-05f5-47c5-a17f-cb331c42fa58/volumes" Mar 20 18:27:41 crc kubenswrapper[4803]: I0320 18:27:41.958049 4803 scope.go:117] "RemoveContainer" containerID="1c2a0542c608ec433df9145639ddd273d3cdd7b87f40a7194f544159a08861d2" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.156983 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567188-44fn9"] Mar 20 18:28:00 crc kubenswrapper[4803]: E0320 18:28:00.159203 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d701028b-b006-498e-98eb-cb8b73759e6f" containerName="oc" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.159221 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="d701028b-b006-498e-98eb-cb8b73759e6f" containerName="oc" Mar 20 18:28:00 crc kubenswrapper[4803]: E0320 18:28:00.159234 4803 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerName="gather" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.159240 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerName="gather" Mar 20 18:28:00 crc kubenswrapper[4803]: E0320 18:28:00.159288 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerName="copy" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.159295 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerName="copy" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.159490 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerName="gather" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.159547 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="11de5097-05f5-47c5-a17f-cb331c42fa58" containerName="copy" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.159564 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="d701028b-b006-498e-98eb-cb8b73759e6f" containerName="oc" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.160329 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-44fn9" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.162962 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.163183 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.163124 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.193564 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-44fn9"] Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.293315 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x5gz\" (UniqueName: \"kubernetes.io/projected/326be08a-7a34-4d44-826b-8f7adb66e184-kube-api-access-9x5gz\") pod \"auto-csr-approver-29567188-44fn9\" (UID: \"326be08a-7a34-4d44-826b-8f7adb66e184\") " pod="openshift-infra/auto-csr-approver-29567188-44fn9" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.396281 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x5gz\" (UniqueName: \"kubernetes.io/projected/326be08a-7a34-4d44-826b-8f7adb66e184-kube-api-access-9x5gz\") pod \"auto-csr-approver-29567188-44fn9\" (UID: \"326be08a-7a34-4d44-826b-8f7adb66e184\") " pod="openshift-infra/auto-csr-approver-29567188-44fn9" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.418192 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x5gz\" (UniqueName: \"kubernetes.io/projected/326be08a-7a34-4d44-826b-8f7adb66e184-kube-api-access-9x5gz\") pod \"auto-csr-approver-29567188-44fn9\" (UID: \"326be08a-7a34-4d44-826b-8f7adb66e184\") " 
pod="openshift-infra/auto-csr-approver-29567188-44fn9" Mar 20 18:28:00 crc kubenswrapper[4803]: I0320 18:28:00.483674 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-44fn9" Mar 20 18:28:01 crc kubenswrapper[4803]: I0320 18:28:01.004937 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567188-44fn9"] Mar 20 18:28:01 crc kubenswrapper[4803]: I0320 18:28:01.025226 4803 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 18:28:01 crc kubenswrapper[4803]: I0320 18:28:01.260720 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-44fn9" event={"ID":"326be08a-7a34-4d44-826b-8f7adb66e184","Type":"ContainerStarted","Data":"031cd7e7a1a21cb8aecab0e97db25799c7281c220f34eaff80603ebd576c5ca3"} Mar 20 18:28:04 crc kubenswrapper[4803]: I0320 18:28:04.311596 4803 generic.go:334] "Generic (PLEG): container finished" podID="326be08a-7a34-4d44-826b-8f7adb66e184" containerID="7adb671de3e3e7989fe5a198b3d50d84c80c516092e1aed59604fbc2711dd043" exitCode=0 Mar 20 18:28:04 crc kubenswrapper[4803]: I0320 18:28:04.311698 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-44fn9" event={"ID":"326be08a-7a34-4d44-826b-8f7adb66e184","Type":"ContainerDied","Data":"7adb671de3e3e7989fe5a198b3d50d84c80c516092e1aed59604fbc2711dd043"} Mar 20 18:28:05 crc kubenswrapper[4803]: I0320 18:28:05.705997 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-44fn9" Mar 20 18:28:05 crc kubenswrapper[4803]: I0320 18:28:05.902950 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x5gz\" (UniqueName: \"kubernetes.io/projected/326be08a-7a34-4d44-826b-8f7adb66e184-kube-api-access-9x5gz\") pod \"326be08a-7a34-4d44-826b-8f7adb66e184\" (UID: \"326be08a-7a34-4d44-826b-8f7adb66e184\") " Mar 20 18:28:05 crc kubenswrapper[4803]: I0320 18:28:05.913967 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/326be08a-7a34-4d44-826b-8f7adb66e184-kube-api-access-9x5gz" (OuterVolumeSpecName: "kube-api-access-9x5gz") pod "326be08a-7a34-4d44-826b-8f7adb66e184" (UID: "326be08a-7a34-4d44-826b-8f7adb66e184"). InnerVolumeSpecName "kube-api-access-9x5gz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:28:06 crc kubenswrapper[4803]: I0320 18:28:06.006676 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x5gz\" (UniqueName: \"kubernetes.io/projected/326be08a-7a34-4d44-826b-8f7adb66e184-kube-api-access-9x5gz\") on node \"crc\" DevicePath \"\"" Mar 20 18:28:06 crc kubenswrapper[4803]: I0320 18:28:06.333675 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567188-44fn9" event={"ID":"326be08a-7a34-4d44-826b-8f7adb66e184","Type":"ContainerDied","Data":"031cd7e7a1a21cb8aecab0e97db25799c7281c220f34eaff80603ebd576c5ca3"} Mar 20 18:28:06 crc kubenswrapper[4803]: I0320 18:28:06.333724 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031cd7e7a1a21cb8aecab0e97db25799c7281c220f34eaff80603ebd576c5ca3" Mar 20 18:28:06 crc kubenswrapper[4803]: I0320 18:28:06.333781 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567188-44fn9" Mar 20 18:28:06 crc kubenswrapper[4803]: I0320 18:28:06.796987 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-52597"] Mar 20 18:28:06 crc kubenswrapper[4803]: I0320 18:28:06.807794 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567182-52597"] Mar 20 18:28:06 crc kubenswrapper[4803]: I0320 18:28:06.865614 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64" path="/var/lib/kubelet/pods/ebfb5d07-87d3-4a25-a8e8-5689fe5d9d64/volumes" Mar 20 18:28:08 crc kubenswrapper[4803]: I0320 18:28:08.246784 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:28:08 crc kubenswrapper[4803]: I0320 18:28:08.247232 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:28:38 crc kubenswrapper[4803]: I0320 18:28:38.246188 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:28:38 crc kubenswrapper[4803]: I0320 18:28:38.246884 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:28:42 crc kubenswrapper[4803]: I0320 18:28:42.066194 4803 scope.go:117] "RemoveContainer" containerID="e241cc7525607ecf1217ca5ce1cb4dac8ab78cf8efc4a9e899746cb2fdbfdabd" Mar 20 18:29:08 crc kubenswrapper[4803]: I0320 18:29:08.245623 4803 patch_prober.go:28] interesting pod/machine-config-daemon-26nll container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 18:29:08 crc kubenswrapper[4803]: I0320 18:29:08.246241 4803 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 18:29:08 crc kubenswrapper[4803]: I0320 18:29:08.246295 4803 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-26nll" Mar 20 18:29:08 crc kubenswrapper[4803]: I0320 18:29:08.247107 4803 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789"} pod="openshift-machine-config-operator/machine-config-daemon-26nll" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 18:29:08 crc kubenswrapper[4803]: I0320 18:29:08.247172 4803 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-26nll" 
podUID="8510a852-14e1-4aba-826c-de9d4cfac290" containerName="machine-config-daemon" containerID="cri-o://08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" gracePeriod=600 Mar 20 18:29:08 crc kubenswrapper[4803]: E0320 18:29:08.886500 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:29:09 crc kubenswrapper[4803]: I0320 18:29:09.068079 4803 generic.go:334] "Generic (PLEG): container finished" podID="8510a852-14e1-4aba-826c-de9d4cfac290" containerID="08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" exitCode=0 Mar 20 18:29:09 crc kubenswrapper[4803]: I0320 18:29:09.068250 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-26nll" event={"ID":"8510a852-14e1-4aba-826c-de9d4cfac290","Type":"ContainerDied","Data":"08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789"} Mar 20 18:29:09 crc kubenswrapper[4803]: I0320 18:29:09.068574 4803 scope.go:117] "RemoveContainer" containerID="2fdd16f5246edd6c5e9048a1a65ae17cc95dfdb32f5019a55bd0a73d20083db4" Mar 20 18:29:09 crc kubenswrapper[4803]: I0320 18:29:09.069265 4803 scope.go:117] "RemoveContainer" containerID="08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" Mar 20 18:29:09 crc kubenswrapper[4803]: E0320 18:29:09.069545 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:29:21 crc kubenswrapper[4803]: I0320 18:29:21.848542 4803 scope.go:117] "RemoveContainer" containerID="08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" Mar 20 18:29:21 crc kubenswrapper[4803]: E0320 18:29:21.849538 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:29:34 crc kubenswrapper[4803]: I0320 18:29:34.848822 4803 scope.go:117] "RemoveContainer" containerID="08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" Mar 20 18:29:34 crc kubenswrapper[4803]: E0320 18:29:34.849570 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:29:45 crc kubenswrapper[4803]: I0320 18:29:45.848892 4803 scope.go:117] "RemoveContainer" containerID="08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" Mar 20 18:29:45 crc kubenswrapper[4803]: E0320 18:29:45.849623 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.171581 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29567190-rnw62"] Mar 20 18:30:00 crc kubenswrapper[4803]: E0320 18:30:00.174354 4803 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="326be08a-7a34-4d44-826b-8f7adb66e184" containerName="oc" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.174397 4803 state_mem.go:107] "Deleted CPUSet assignment" podUID="326be08a-7a34-4d44-826b-8f7adb66e184" containerName="oc" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.174852 4803 memory_manager.go:354] "RemoveStaleState removing state" podUID="326be08a-7a34-4d44-826b-8f7adb66e184" containerName="oc" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.175638 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-rnw62" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.178276 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mrzfh" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.178486 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.178609 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.185707 4803 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm"] Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.187703 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.190341 4803 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.195795 4803 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.207396 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-rnw62"] Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.224202 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm"] Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.336051 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-secret-volume\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.336141 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmjdm\" (UniqueName: \"kubernetes.io/projected/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-kube-api-access-mmjdm\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.336175 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-config-volume\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.336505 4803 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5btbj\" (UniqueName: \"kubernetes.io/projected/d3b2e71e-a5c9-400f-b665-4ea9840002f2-kube-api-access-5btbj\") pod \"auto-csr-approver-29567190-rnw62\" (UID: \"d3b2e71e-a5c9-400f-b665-4ea9840002f2\") " pod="openshift-infra/auto-csr-approver-29567190-rnw62" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.439012 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5btbj\" (UniqueName: \"kubernetes.io/projected/d3b2e71e-a5c9-400f-b665-4ea9840002f2-kube-api-access-5btbj\") pod \"auto-csr-approver-29567190-rnw62\" (UID: \"d3b2e71e-a5c9-400f-b665-4ea9840002f2\") " pod="openshift-infra/auto-csr-approver-29567190-rnw62" Mar 20 
18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.439251 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-secret-volume\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.439423 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmjdm\" (UniqueName: \"kubernetes.io/projected/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-kube-api-access-mmjdm\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.439477 4803 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-config-volume\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.441507 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-config-volume\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.453545 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-secret-volume\") pod \"collect-profiles-29567190-zrfkm\" (UID: 
\"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.454708 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5btbj\" (UniqueName: \"kubernetes.io/projected/d3b2e71e-a5c9-400f-b665-4ea9840002f2-kube-api-access-5btbj\") pod \"auto-csr-approver-29567190-rnw62\" (UID: \"d3b2e71e-a5c9-400f-b665-4ea9840002f2\") " pod="openshift-infra/auto-csr-approver-29567190-rnw62" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.455991 4803 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmjdm\" (UniqueName: \"kubernetes.io/projected/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-kube-api-access-mmjdm\") pod \"collect-profiles-29567190-zrfkm\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.510514 4803 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-rnw62" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.522038 4803 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.854160 4803 scope.go:117] "RemoveContainer" containerID="08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" Mar 20 18:30:00 crc kubenswrapper[4803]: E0320 18:30:00.855206 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290" Mar 20 18:30:00 crc kubenswrapper[4803]: I0320 18:30:00.997680 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29567190-rnw62"] Mar 20 18:30:01 crc kubenswrapper[4803]: I0320 18:30:01.059303 4803 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm"] Mar 20 18:30:01 crc kubenswrapper[4803]: W0320 18:30:01.062491 4803 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44ddc5d9_3fc4_4fb0_a4f4_03b27188769b.slice/crio-65a85cdde8a611db70abf3b542328a8fbcbaf90b4df5fc7d0c6573d3c780c459 WatchSource:0}: Error finding container 65a85cdde8a611db70abf3b542328a8fbcbaf90b4df5fc7d0c6573d3c780c459: Status 404 returned error can't find the container with id 65a85cdde8a611db70abf3b542328a8fbcbaf90b4df5fc7d0c6573d3c780c459 Mar 20 18:30:01 crc kubenswrapper[4803]: I0320 18:30:01.628781 4803 generic.go:334] "Generic (PLEG): container finished" podID="44ddc5d9-3fc4-4fb0-a4f4-03b27188769b" containerID="4c22c065ff0cf00f67096304a93645c3c27948d434349b6d93b364f7e99c14ca" exitCode=0 Mar 20 18:30:01 crc kubenswrapper[4803]: I0320 18:30:01.629148 
4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" event={"ID":"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b","Type":"ContainerDied","Data":"4c22c065ff0cf00f67096304a93645c3c27948d434349b6d93b364f7e99c14ca"} Mar 20 18:30:01 crc kubenswrapper[4803]: I0320 18:30:01.629181 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" event={"ID":"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b","Type":"ContainerStarted","Data":"65a85cdde8a611db70abf3b542328a8fbcbaf90b4df5fc7d0c6573d3c780c459"} Mar 20 18:30:01 crc kubenswrapper[4803]: I0320 18:30:01.630744 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-rnw62" event={"ID":"d3b2e71e-a5c9-400f-b665-4ea9840002f2","Type":"ContainerStarted","Data":"aef681aeeb5646242a6722c9cbb115d75a5efa2b4de9b046d59c20869317e0a2"} Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.059122 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.208264 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmjdm\" (UniqueName: \"kubernetes.io/projected/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-kube-api-access-mmjdm\") pod \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.208626 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-secret-volume\") pod \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.208713 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-config-volume\") pod \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\" (UID: \"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b\") " Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.209168 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-config-volume" (OuterVolumeSpecName: "config-volume") pod "44ddc5d9-3fc4-4fb0-a4f4-03b27188769b" (UID: "44ddc5d9-3fc4-4fb0-a4f4-03b27188769b"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.209409 4803 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.214553 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "44ddc5d9-3fc4-4fb0-a4f4-03b27188769b" (UID: "44ddc5d9-3fc4-4fb0-a4f4-03b27188769b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.215125 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-kube-api-access-mmjdm" (OuterVolumeSpecName: "kube-api-access-mmjdm") pod "44ddc5d9-3fc4-4fb0-a4f4-03b27188769b" (UID: "44ddc5d9-3fc4-4fb0-a4f4-03b27188769b"). InnerVolumeSpecName "kube-api-access-mmjdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.311963 4803 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.312023 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmjdm\" (UniqueName: \"kubernetes.io/projected/44ddc5d9-3fc4-4fb0-a4f4-03b27188769b-kube-api-access-mmjdm\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.655771 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.655772 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29567190-zrfkm" event={"ID":"44ddc5d9-3fc4-4fb0-a4f4-03b27188769b","Type":"ContainerDied","Data":"65a85cdde8a611db70abf3b542328a8fbcbaf90b4df5fc7d0c6573d3c780c459"} Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.655834 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65a85cdde8a611db70abf3b542328a8fbcbaf90b4df5fc7d0c6573d3c780c459" Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.658643 4803 generic.go:334] "Generic (PLEG): container finished" podID="d3b2e71e-a5c9-400f-b665-4ea9840002f2" containerID="eeeb88e1743dfbd0d2627e25021564d4b892211653f50da64cbc56635a1811a5" exitCode=0 Mar 20 18:30:03 crc kubenswrapper[4803]: I0320 18:30:03.658701 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-rnw62" event={"ID":"d3b2e71e-a5c9-400f-b665-4ea9840002f2","Type":"ContainerDied","Data":"eeeb88e1743dfbd0d2627e25021564d4b892211653f50da64cbc56635a1811a5"} Mar 20 18:30:04 crc kubenswrapper[4803]: I0320 18:30:04.148502 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6"] Mar 20 18:30:04 crc kubenswrapper[4803]: I0320 18:30:04.162955 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29567145-wwwq6"] Mar 20 18:30:04 crc kubenswrapper[4803]: I0320 18:30:04.869360 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="800f6249-746f-4ac2-9bab-30998a996ac2" path="/var/lib/kubelet/pods/800f6249-746f-4ac2-9bab-30998a996ac2/volumes" Mar 20 18:30:05 crc kubenswrapper[4803]: I0320 18:30:05.006486 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-rnw62" Mar 20 18:30:05 crc kubenswrapper[4803]: I0320 18:30:05.152476 4803 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5btbj\" (UniqueName: \"kubernetes.io/projected/d3b2e71e-a5c9-400f-b665-4ea9840002f2-kube-api-access-5btbj\") pod \"d3b2e71e-a5c9-400f-b665-4ea9840002f2\" (UID: \"d3b2e71e-a5c9-400f-b665-4ea9840002f2\") " Mar 20 18:30:05 crc kubenswrapper[4803]: I0320 18:30:05.158234 4803 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b2e71e-a5c9-400f-b665-4ea9840002f2-kube-api-access-5btbj" (OuterVolumeSpecName: "kube-api-access-5btbj") pod "d3b2e71e-a5c9-400f-b665-4ea9840002f2" (UID: "d3b2e71e-a5c9-400f-b665-4ea9840002f2"). InnerVolumeSpecName "kube-api-access-5btbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 18:30:05 crc kubenswrapper[4803]: I0320 18:30:05.254947 4803 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5btbj\" (UniqueName: \"kubernetes.io/projected/d3b2e71e-a5c9-400f-b665-4ea9840002f2-kube-api-access-5btbj\") on node \"crc\" DevicePath \"\"" Mar 20 18:30:05 crc kubenswrapper[4803]: I0320 18:30:05.681819 4803 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29567190-rnw62" event={"ID":"d3b2e71e-a5c9-400f-b665-4ea9840002f2","Type":"ContainerDied","Data":"aef681aeeb5646242a6722c9cbb115d75a5efa2b4de9b046d59c20869317e0a2"} Mar 20 18:30:05 crc kubenswrapper[4803]: I0320 18:30:05.681861 4803 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef681aeeb5646242a6722c9cbb115d75a5efa2b4de9b046d59c20869317e0a2" Mar 20 18:30:05 crc kubenswrapper[4803]: I0320 18:30:05.681871 4803 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29567190-rnw62" Mar 20 18:30:06 crc kubenswrapper[4803]: I0320 18:30:06.128757 4803 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-5jg5z"] Mar 20 18:30:06 crc kubenswrapper[4803]: I0320 18:30:06.139730 4803 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29567184-5jg5z"] Mar 20 18:30:06 crc kubenswrapper[4803]: I0320 18:30:06.870569 4803 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637b5048-6215-47ef-ac6a-454784740aab" path="/var/lib/kubelet/pods/637b5048-6215-47ef-ac6a-454784740aab/volumes" Mar 20 18:30:11 crc kubenswrapper[4803]: I0320 18:30:11.848016 4803 scope.go:117] "RemoveContainer" containerID="08ce85dbe0a9907ae7809be1336bdaff328939b6819af69de632bd2ff86ef789" Mar 20 18:30:11 crc kubenswrapper[4803]: E0320 18:30:11.849085 4803 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-26nll_openshift-machine-config-operator(8510a852-14e1-4aba-826c-de9d4cfac290)\"" pod="openshift-machine-config-operator/machine-config-daemon-26nll" podUID="8510a852-14e1-4aba-826c-de9d4cfac290"